Software testing
Lecture p8
T120B029, 2012 spring semester
2
Motivation
“Testing is the process of executing a program with the intent of finding errors”
3
Limits of software testing
• “Good” testing will find bugs
• “Good” testing is based on requirements, i.e. testing tries to find differences between the expected and the observed behavior of systems or their components
• BUT: Testing can only prove the presence of bugs - never their absence
4
Test Specification - Manual Process
[Diagram: a text-based test document is inspected manually, and test cases are derived manually from it - every step in the process is manual]
5
Semi-Automated Process
[Diagram: a model-based test document is inspected manually, but test cases are derived semi-automatically; consistency, syntax, and semantic checks are automated]
6
The V-model

[Diagram: the construction phases (I-IV) descend through Requirements Definition, Functional System Design, Technical System Design, Component Specification, and Implementation; the testing phases ascend through Unit Level Testing, Integration Level Testing, System Level Testing, and Acceptance Level Testing, each level paired with its construction counterpart]
7
The W-model

[Diagram: as in the V-model, the construction phases (I-IV) descend to Implementation and the testing phases ascend from Unit Level Testing to Acceptance Level Testing; in addition, preparation of each testing level (Preparation of Unit, Integration, System, and Acceptance Level Testing) runs in parallel with the corresponding construction phase, and debugging accompanies implementation]
8
Using UML and UTP in system development

[Diagram: at each level I-IV, the UML design models are complemented by UTP test specifications; tests are executed with TTCN at the upper levels and with JUnit (and TTCN) at the unit level, and debugging at every level feeds implementation changes back into the models]
9
Exhaustive testing
There are about 10^14 possible paths! If we execute one test per millisecond, it would take about 3,170 years to test this program!! Out of the question.

[Flow graph: an if-then-else nested inside a loop that executes up to 20 times]
10
Testability = how easily a program can be tested

Features that help/influence testing:
• operability - it operates cleanly
• observability
  - the results are easy to see
  - distinct output is generated for each input
  - incorrect output is easily identified
• controllability
  - processing can be controlled
  - tests can be automated & reproduced
• decomposability - software modules can be tested independently
• simplicity - no complex architecture and logic
• stability - few changes are requested during testing
• understandability - program is easy to understand
11
Who tests the software
The developer understands the system, but will test “gently” and is driven by “delivery”.
The independent tester must learn about the system, but will attempt to break it and is driven by quality.
12
Test cases
• Describe how to test a system/module/function
• Description must identify:
  - system state before executing the test
  - function to be tested
  - parameter values for the test
  - expected outcome of the test
Some automation should be possible after defining them!
13
Black box vs. white box testing
[Diagram: in black box testing, inputs and events are derived from the requirements and the resulting outputs are observed; in white box testing, the internal structure is examined as well]
14
White box testing
…our goal is to ensure that all statements and conditions have been executed at least once...
15
White box testing: Why cover all paths?
• logic errors and incorrect assumptions are inversely proportional to a path’s execution probability
• we often believe that a path is not likely to be executed; in fact, reality is often counter intuitive
• typographical errors are random; it’s likely that untested paths will contain some
16
Guaranteed coverage: Exhaustive testing

But: There are 5^20 ≈ 10^14 possible paths!

[Flow graph: an if-then-else nested inside a loop that executes up to 20 times]
17
Better: Selective testing

[Flow graph: a single selected path through the loop (loop <= 20x) is exercised]
18
Path coverage testing
1) Derive a logical complexity measure
2) Use it to define a basis set of execution paths
First, we compute the cyclomatic complexity V(G):

V(G) = number of transitions - number of nodes + 2
In this case, V (G) = 4
V(G) provides an upper bound of tests that must be executed to guarantee coverage of all program statements
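As a small illustration, the formula can be applied directly. The class and method names below are my own, and the edge/node counts assume the example flow graph used on these slides:

```java
// Sketch: computing cyclomatic complexity V(G) from a flow graph's
// transition (edge) and node counts. Names are illustrative.
class CyclomaticComplexity {

    // V(G) = number of transitions - number of nodes + 2
    static int vOfG(int transitions, int nodes) {
        return transitions - nodes + 2;
    }

    public static void main(String[] args) {
        // A graph with 10 transitions and 8 nodes, matching the example
        // where V(G) = 4:
        System.out.println(vOfG(10, 8)); // prints 4
    }
}
```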
19
Path coverage set

• Cyclomatic complexity defines the number of independent paths in the basis set
• Path coverage set = set of paths that will execute all statements and all conditions in a program at least once
• Goal: Define test cases for the basis set
• The path coverage set is not unique!
20
Path coverage testing

[Flow graph with nodes 1-8: node 2 branches to 3 and 4; node 3 branches to 5 and 6; nodes 4, 5, and 6 join at node 7, which either loops back to 2 or exits to 8]

Next, we derive the independent paths. Since V(G) = 4, there are four paths:

Path 1: 1,2,3,6,7,8
Path 2: 1,2,3,5,7,8
Path 3: 1,2,4,7,8
Path 4: 1,2,4,7,2,4…7,8
Finally, we derive test cases to exercise these paths, i.e. choose inputs that lead to traversing the paths
21
Documenting test cases
• Name
• Number
• Description of system state before running the test case
• Values for the inputs
• Expected outputs
• Short description (if needed)
22
Path coverage testing - remarks
• you don’t need a flow chart, but the picture will help when you trace program paths
• each connection between boxes counts as 1 transition
• path coverage testing should be applied to critical modules
Connecting four boxes counts as three transitions
23
Black box testing
[Diagram: in black box testing, inputs and events are derived from the requirements and the resulting outputs are checked]
24
Categories of errors in black box testing

• incorrect or missing functions
• interface errors
• errors in data structures or external database access
• performance errors
• initialization and termination errors
25
Example
abs(integer x)
  return x if x >= 0
  return -x if x < 0

• The concrete value of x does not matter! This is the basis for equivalence testing.
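A sketch of what this means in code (the class is invented for illustration): each equivalence class needs only one representative test value.

```java
// Sketch: equivalence testing for abs - one representative value per
// class suffices, since the concrete value of x does not matter.
class AbsEquivalence {

    static int abs(int x) {
        return (x >= 0) ? x : -x;
    }

    public static void main(String[] args) {
        System.out.println(abs(7));  // representative of the class x >= 0
        System.out.println(abs(-7)); // representative of the class x < 0
        System.out.println(abs(0));  // boundary between the two classes
    }
}
```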
26
Loops
• Cornerstone of every program
• Loops can lead to non-terminating programs
27
Loop testing

• simple loops
• nested loops
• concatenated loops
• unstructured loops
28
Loop testing: simple loops
Minimum conditions for simple loops:
1. skip the loop entirely
2. only one pass through the loop
3. two passes through the loop
4. m passes through the loop, m < n
5. (n-1), n, and (n+1) passes through the loop

where n is the maximum number of allowable passes
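The minimum conditions above can be written down directly. The sketch below (illustrative names, plain Java) collects the pass counts to test for a loop bound n and a typical value m:

```java
import java.util.LinkedHashSet;
import java.util.Set;

// Sketch: minimum pass counts for simple-loop testing, where n is the
// maximum number of allowable passes and m is a typical value, m < n.
class SimpleLoopTests {

    static Set<Integer> passCounts(int n, int m) {
        Set<Integer> counts = new LinkedHashSet<>();
        counts.add(0);     // 1. skip the loop entirely
        counts.add(1);     // 2. only one pass
        counts.add(2);     // 3. two passes
        counts.add(m);     // 4. m passes, m < n
        counts.add(n - 1); // 5. just below the bound
        counts.add(n);     //    at the bound
        counts.add(n + 1); //    just above the bound
        return counts;
    }

    public static void main(String[] args) {
        // For the slides' loop with at most n = 20 passes:
        System.out.println(passCounts(20, 10)); // [0, 1, 2, 10, 19, 20, 21]
    }
}
```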
29
Nested loops

• Just extending simple loop testing: the number of tests explodes
• Reduce the number of tests:
  - start at the innermost loop; set all other loops to minimum values
  - conduct simple loop tests; add out-of-range or excluded values
  - work outwards while keeping inner nested loops to typical values
  - continue until all loops have been tested
30
Equivalence class testing / Equivalence partitioning

[Diagram: partitioning is based on input conditions - user queries, numerical data, output format requests, responses to prompts, command key input, and mouse picks on menus]
31
Equivalence classes (1)
• Valid data
  - user-supplied commands
  - responses to system prompts
  - file names
  - computational data (physical parameters, bounding values, initiation values)
  - output data formatting commands
  - responses to error messages
  - graphical data (e.g., mouse picks)
32
Equivalence classes (2)
• Invalid data
  - data outside bounds of the program
  - physically impossible data
  - proper value supplied in wrong place
33
Defining equivalence classes
• Input condition is a range: one valid and two invalid classes are defined
• Input condition requires specific value: one valid and two invalid classes are defined
• Input condition is boolean: one valid and one invalid class are defined
Then define one test case for each equivalence class
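As a sketch of the first rule (the range 1..99 and all names below are invented for illustration), a range input condition yields one valid and two invalid classes, each covered by one test case:

```java
// Sketch: equivalence classes for an input condition that is a range,
// e.g. a value that must lie in 1..99 (bounds chosen for illustration).
class RangeClasses {

    static boolean inRange(int value) {
        return value >= 1 && value <= 99; // the one valid class
    }

    public static void main(String[] args) {
        // One valid and two invalid classes, one test case each:
        System.out.println(inRange(50));  // valid class -> true
        System.out.println(inRange(0));   // invalid class below -> false
        System.out.println(inRange(100)); // invalid class above -> false
    }
}
```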
34
Automating software testing
• Manual software testing is time consuming
• Software testing has to be repeated after every change (regression testing)
• Write test drivers that can run automatically and produce a test report
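A minimal sketch of such a driver (illustrative names; in practice a framework such as JUnit, mentioned elsewhere in these slides, would be used):

```java
// Minimal sketch of an automatic test driver: runs checks, counts
// failures, and prints a test report. Names are illustrative.
class MiniTestDriver {

    private int failed = 0;

    // Run one named check and report its outcome.
    void check(String name, boolean ok) {
        System.out.println(name + ": " + (ok ? "succeeded" : "failed"));
        if (!ok) failed++;
    }

    // Print a summary and return the failure count.
    int report() {
        System.out.println("Errors found = " + failed);
        return failed;
    }

    public static void main(String[] args) {
        MiniTestDriver driver = new MiniTestDriver();
        driver.check("abs of a negative value", Math.abs(-3) == 3);
        driver.check("abs of zero", Math.abs(0) == 0);
        driver.report(); // prints "Errors found = 0"
    }
}
```

Because the driver runs without user interaction, it can be re-run after every change, which is exactly what regression testing requires.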
35
Testing OO software: general remarks
• Based on UML specifications– Use cases– Class diagrams– State transition diagrams– …
• Problem: Effort
focus on requirements - not comprehensive quality?
36
Scenario based testing
• Concentrates on (functional) requirements
• Based on:
  - use cases
  - corresponding sequence diagrams
37
Test Requirements

• Every use case
• Every fully expanded extension (<<extend>>) combination
• Every fully expanded uses (<<uses>>) combination
• Tests normal as well as exceptional behavior
38
Example
39
Scenarios for ATM Example
40
Test Procedure (1)

• Establish testing budget
• Rank use cases (& variants) according to:
  - relative frequencies
  - criticality
• Allocate #test cases to each use case (and possibly variants)
• Develop test cases for scenarios
41
Testing a use case/scenarios (1)
• A scenario is a path through a sequence diagram
• There are many different scenarios associated with a sequence diagram!
42
What can be wrong?

• Incorrect or missing output
• Missing function/feature in an object
• Incorrect parameter values → boundary value analysis
• Correct message - wrong object
• Incorrect message - right object
• Incorrect object state
• Message sent to destroyed object
• Incorrect exception thrown
• Incorrect exception handling
• Incorrect assumption about receiver’s class
  - class casts
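For the incorrect-parameter-values case, boundary value analysis tests on and just around each boundary. A sketch with a hypothetical 4-digit PIN check (invented for illustration, not the ATM example's actual interface):

```java
// Sketch: boundary value analysis for a parameter that must be a
// 4-digit PIN. Hypothetical check, not the ATM example's real API.
class PinBoundary {

    static boolean isValidPin(String pin) {
        // exactly 4 characters, all digits
        return pin.length() == 4 && pin.chars().allMatch(Character::isDigit);
    }

    public static void main(String[] args) {
        System.out.println(isValidPin("123"));   // just below the length boundary -> false
        System.out.println(isValidPin("1234"));  // on the boundary -> true
        System.out.println(isValidPin("12345")); // just above the boundary -> false
    }
}
```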
43
Testing a use case/scenarios (2)
• All paths in the sequence diagram should be executed
  - focus on messages from the actor/UI to the system
  - if possible: check “internal” objects
• Extract control flow information from the sequence diagram
  - test branches and loops
  - test exceptions
  - consider state
44
Example: Establish Session
[Sequence diagram; a loop guarded by a “< 4” condition is visible]
45
Example (cont.)

• Several checks are performed:
  - Is the card an ATM card?
  - Is the card stolen?
  - Has the customer a valid bank account?
  - Has the customer a valid PIN?
• Three chances to enter the PIN
• Service charge if from a different bank
46
Test Procedure (2)
• Translate the sequence diagram into a flow graph:
  - each message or group of consecutive messages becomes a segment (box)
  - a conditional action (branch, iteration) ends a segment
  - each bracketed (loop) condition becomes a decision (hexagon)
47
Establish Session
48
Remarks

• White-box testing can be applied to the flow graph
• The control flow graph may reveal ambiguities and omissions in the sequence diagram
• Ambiguities either reveal faults or a behavior to be interpreted by the coder
49
Path conditions

• Input and object state need to satisfy the path conditions
• Identify predicates on each path; work from the first predicate and identify required input and state variable values
• Do the same for the next path conditions until the last one
• Some paths may not be feasible
50
Example: Path Conditions
51
Special cases

• Sequence diagrams rarely include two significant details:
  - Polymorphism: the receiver has to be of a specific type, but not an instance of one specific class
    • if needed: check instances of different classes
  - Exception handling: “jumping out” of the sequence diagram
    • catch the exception
52
Test Procedure (3)
• Develop the Use Case / Class coverage matrix
• Analyze the matrix:
  - Which classes are not covered by test cases?
  - Which methods of a class are not covered?
• Create additional test cases
53
Use case / class coverage matrix
Class:       ResourcePool                    Resource     …
Method:      newResource  deleteResource  …  setname   …  …

use case 1        X              X
use case 2        X
…
54
Test case document
1. Usual cover information
2. Use case test descriptions
3. References
4. Appendices
55
Test Procedure (4)

• Define test case:
  - name
  - unique number
  - textual description of the test procedure
    • based on flow graph and conditions
    • includes expected results after each step: define a condition that allows one to determine whether the step succeeded or failed
56
Use case test descriptions
• For every use case/sequence diagram in the design document:
  - a test case that shows that the scenario and its variations can be executed by the system
    • describe as precondition the objects, and their state, that get messages from the UI
    • as steps: list the sequence of messages that are sent from the UI to other objects
  - test cases for all exceptions of the use case
57
Example: Test case definition (1)
Test “Establish session”
UID: 0001
Description:

Precondition: ATM is running and no sessions exist
…
58
Example: Test case definition (2)
…
Step 1: test “begin session”
  To execute: Create a session object
  Expected result: session object exists and is in its initial state

Step 2: test “no ATM card was entered”
  Precondition: Card entered is not an ATM card
  To execute: Read card
  Expected result: NoATMCard exception is thrown and session object was deleted
…
59
Example: Test case definition (3)
…
Step 3: test “get PIN”
  Precondition: Card entered is an ATM card
  To execute (a): displayEnterPIN
  Expected result: “Enter PIN” is displayed
  To execute (b): getEntry
  Expected result: 4-digit entry of a stolen card
  To execute (c): checkCard
  Expected result: StolenCardException is thrown
…
60
Example: Test case definition (4)
…
  To execute (d): getEntry
  Expected result: 4-digit entry of an invalid account
  To execute (e): checkCard
  Expected result: false
  To execute (g): getEntry
  Expected result: 4-digit entry of a valid account
  To execute (h): checkCard
  Expected result: true
…
61
Executing tests

• Develop automatic test drivers
  - In the assignment: the test driver should run on the CLIENT side, to make sure that the client/server communication is properly implemented
• Use case tests:
  - develop test classes that execute the use cases without user involvement
• Test the use cases manually with the UI
• The system test driver calls all use case test classes
62
Implementing test cases
• Test driver code must:
  - create the system state described in the test case
  - call the method to test
  - check if the expected result is produced
  - print the test result to a file/console
  - count the number of failed tests to produce a summary result at the end
63
Test driver (1)

package atm.impl.test;

public class ATMUseCaseTest {

    private int allErrors = 0; // count #errors

    private void usecase1() { // test use case 1
        try {
            System.out.println("========");
            System.out.println("Testing Use Case 1: Establish Session");
            testStep1();
            testStep2();
            testStep3();
            …
            System.out.println("Establish Session: Errors found = " + allErrors);
            System.out.println("Finished testing Use Case 1: Establish Session");
        }
        catch (Exception anExc) {
            System.out.println("Unexpected exception " + anExc + " in Use Case 1");
            allErrors = allErrors + 1;
            System.out.println("Establish Session: Errors found = " + allErrors);
            System.out.println("Finished testing Use Case 1: Establish Session");
        }
    }
    …
64
Test driver (2)

    private void testStep1() {
        try {
            // test step 1 of use case 1
            System.out.print("Testing Step 1: Begin Session ");
            // create precondition/state
            // execute test
            Session aSession = new Session();
            // check postcondition and print result
            // (note: in Java, `new` never yields null, so a real check
            // would inspect the new session's state instead)
            if (aSession == null) {
                System.out.println("failed");
                allErrors = allErrors + 1;
            } else {
                System.out.println("succeeded");
            }
        }
        catch (Exception anExc) {
            System.out.println("Unexpected exception " + anExc + " in Step 1");
            allErrors = allErrors + 1;
        }
    }
    …
65
Tools to use (1)

[Diagram: the UTP model feeds tools for test design, test specification, test validation, test execution, and test result analysis]
66
Tools to use (2)

• Test design
• Test specification
• Test validation
• Test execution
• Test result analysis
• Test generation
• System model validation
67
Tools to use (3)

• Test management tools
• Monitoring tools
• Defect tracking tools
• Static analysis tools
• Metrics tools
68
Tools to use (4)

[Diagram: a test management tool converts tests to UTP format and serializes them with an XMI encoder; via an abstraction layer, a test execution tool deserializes them with an XMI decoder and converts them back from UTP format]