

1

Benoit Baudry

– 2000: Master's degree at the Univ. de Rennes 1

– June 2003: PhD thesis, "Testable assembly and component validation", with Yves Le Traon and Jean-Marc Jézéquel, in the Triskell group at the Univ. de Rennes 1

– Next: postdoc position at CEA-Saclay, on MDA

2

Test in the Triskell group

Triskell: Model Driven Engineering for Component Based Software
(http://www.irisa.fr/triskell/welcome.htm)

UML-based OO testing (Yves Le Traon):

– Test order, integration strategies
  Vu Le Hanh. "Test and UML model: strategy, plan and test synthesis". PhD thesis.

– Test generation from the requirements (Clémentine Nebut)

– UML-based test generation
  Alain Le Guennec. "Software Engineering and Formal Methods with UML: Specification, Validation and Test Generation". PhD thesis.

– PhD starting on MDA and testing (Franck Fleurey)

Testable assembly and component validation

Benoit Baudry

Triskell group, IRISA, Rennes, France

Jean-Marc Jézéquel

Yves Le Traon

4

Complex software systems are built with components as the unit of reuse

– Trustable components

– Techniques to assemble components

5

Trustable component

[Diagram: a component consists of a specification, an implementation, and a V&V test case set.]

Trust based on:

– consistency

– executable contracts

6

Component assembly

– Impact of design by contract

– Impact of coupling on a testability factor

7

Context

Software components
Object-oriented design and analysis
Specific structures in OO programs

Need to adapt testing techniques
R. V. Binder, "Testing Object-Oriented Systems: Models, Patterns and Tools". Addison-Wesley, 1999.

8

Summary

Related work:
– Software testing
– Mutation analysis

Automatic test case generation
Design by Contract and testing
Component testability

9

Software testing

Two types:
– Structural: based on the implementation
– Functional: based on the specification of functionalities

Objectives:
– Examine or execute a program searching for errors
– Possibly: robustness, performance, safety properties

10

Software testing

[Flowchart: test data generation → test case execution → oracle → diagnosis (correct / not correct); the test criterion is checked (verified / not verified) to decide whether to stop or to generate more test data.]

11

Mutation analysis

R. DeMillo, R. Lipton and F. Sayward, "Hints on Test Data Selection: Help for the Practicing Programmer". IEEE Computer 11(4): 34-41, April 1978.

A technique that aims at validating the quality of a test case set:
– Errors are injected into the program under test
– The proportion of errors detected by the test cases is computed

12

Mutation analysis

Several error types: mutation operators

– Operators defined from the analysis of error sets observed during development
  J. Offutt, A. Lee, G. Rothermel, R. H. Untch and C. Zapf, "An Experimental Determination of Sufficient Mutant Operators". ACM Transactions on Software Engineering and Methodology 5(2): 99-118, April 1996.

– Recent work [Ma'02, Alexander'02] proposes OO-specific operators

13

Mutation analysis

A faulty program is a mutant
Test cases detect mutants:

– test cases kill mutants

Mutation score:
– the proportion of killed mutants measures the quality of the test cases

Two oracles:
– trace difference
– executable contracts
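As an illustration of the mutation score just defined, here is a minimal Python sketch (not the thesis tooling, which targets Java/C# and Eiffel programs) using a trace-difference oracle:

```python
# Illustrative sketch: computing a mutation score.
# A mutant is "killed" when at least one test case distinguishes it from
# the original program; the score is the proportion of killed mutants.

def mutation_score(original, mutants, test_cases):
    """Return the proportion of mutants killed by the test cases."""
    killed = 0
    for mutant in mutants:
        # Trace-difference oracle: the mutant is killed as soon as one
        # test case produces a different output than the original program.
        if any(mutant(t) != original(t) for t in test_cases):
            killed += 1
    return killed / len(mutants)

# Toy example: the program doubles its input; one mutant adds instead.
original = lambda x: 2 * x
mutants = [lambda x: x + 2,   # killed by any input other than 2
           lambda x: 2 * x]   # equivalent mutant: can never be killed
print(mutation_score(original, mutants, test_cases=[1, 3]))  # 0.5
```

The second mutant illustrates why equivalent mutants must be deleted from the set by hand, as the process on the next slide shows.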

14

Mutation analysis

[Diagram: mutation analysis process. The program P goes through mutant generation (mutants 1 to 6); the test cases TC are executed against each mutant, followed by a diagnosis. A killed mutant ends the automatic part of the cycle. An alive mutant is analysed manually: an equivalent mutant is deleted from the set of mutants, an incomplete specification leads to adding contracts, and insufficient test cases lead to optimising the test set.]

15

Automatic test case generation and optimisation

16

Automatic test case optimisation

An average mutation score is easy to reach:
– by hand or with random generation

Unit testing:
– class testing in an OO context

Component testing:
– a class assembly with a main interface class

17

Genetic algorithm

1. Build an initial population
2. Compute the fitness value (the mutation score) for each individual
3. Reproduction
4. Crossover
5. Mutate one or several individuals
(the genetic loop repeats from step 2)

Several stop criteria: a good mutation score, a number of generations…
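The genetic loop above can be sketched as follows. This is an illustrative Python version with made-up names and parameters, not the actual framework (which was developed in Java and C#); individuals are fixed-size sets of test inputs and the fitness is a mutation score supplied by the caller.

```python
# Hedged sketch of the genetic loop: reproduction keeps the best half,
# one-point crossover builds children, then individuals are mutated.
import random

def genetic_loop(fitness, gene_pool, set_size=4, pop_size=20,
                 generations=50, mut_rate=0.02, target=1.0):
    # Initial population: random fixed-size test sets.
    pop = [[random.choice(gene_pool) for _ in range(set_size)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        if fitness(pop[0]) >= target:           # stop criterion
            break
        parents = pop[:pop_size // 2]           # reproduction: keep the best
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, set_size)  # one-point crossover
            children.append(a[:cut] + b[cut:])
        pop = parents + children
        for ind in pop:                          # mutation
            for i in range(set_size):
                if random.random() < mut_rate:
                    ind[i] = random.choice(gene_pool)
    return max(pop, key=fitness)
```

Note that mutation is also applied to the kept parents, so a good individual can be lost between generations; this is the memory weakness that motivates the bacteriologic algorithm later in the talk.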

[Figure: a population of individuals, each a set of Eiffel test cases for a P_TIME class, e.g.:

test_set is do time.set_hour(10) time.set_minute(7) time.set_second(2) time.set(10,7,5) end
test_hour is do time.set_hour(0) time.set_hour(1) end
test_hour_12 is do time.set_hour(0) end
test_is_am_pm is do time.set_hour(12) time.set_hour(1) time.set_hour(13) time.set_hour(0) end
test_out is do time.set(02,02,02) end
test3 is do time.set(0,0,0) end
test_comparaison is local time1 : P_TIME do !!time1 time.set(15,59,59) time1.set(14,59,59) time1.set(16,00,00) end

Individuals share and recombine these test cases.]

18

Genetic algorithm

Tools development (Java, C#):
– For mutation analysis: JMutator, NMutator
– A framework for the genetic algorithm
– A test driver

Experiments:
– Several classes and components

19

C# case study

[Plot: genetic algorithm with a 2% mutation rate; mutation score (%), from 50 to 90, against the number of generations, from 0 to 200.]

20

Genetic algorithm

Fixed size for the test case set
Crossover is not very useful
Reproduction is not efficient at keeping memory

21

Bacteriologic algorithm

Rosenzweig, "Species Diversity in Space and Time". Cambridge University Press, 1995.

Deleted:
– the notion of individual
– the crossover operation

Introduced:
– the notion of bacterium: a test case
– a memory: a set of good bacteria

22

Bacteriologic algorithm

1. Build an initial medium
2. Compute the fitness value (the mutation score) for each bacterium
3. Memorise the best ones
4. Mutation

Several stop criteria: a good mutation score, a number of generations…

[Figure: the bacteriologic medium, a set of bacteria (Eiffel test cases), and the memory of the best ones, e.g.:

test_out is do time.set(15,59,59) end
test_hour is do time.set_hour(0) time.set_hour(1) end
test3 is do time.set(0,0,0) end
test_set is do time.set_hour(10) time.set_minute(7) time.set_second(2) time.set(10,7,5) end

The mutation operator alters a bacterium, e.g. test_out is do time.set(15,59,59) end becomes test_out is do time.set(15,60,59) end.]
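The bacteriologic loop can be sketched as follows: bacteria are single test cases, and the memory keeps every bacterium that improves the fitness of the memorised set. This is an illustrative Python version under those assumptions, not the actual Java/C# tools.

```python
# Hedged sketch of the bacteriologic loop: no individuals, no crossover;
# a memory accumulates the useful bacteria, and the medium only mutates.
def bacteriologic_loop(set_fitness, mutate, medium, rounds=100, target=1.0):
    memory = []
    for _ in range(rounds):
        # Pick the bacterium that most improves the memorised set.
        best = max(medium, key=lambda b: set_fitness(memory + [b]))
        if set_fitness(memory + [best]) > set_fitness(memory):
            memory.append(best)               # memorise the useful bacterium
        if set_fitness(memory) >= target:     # stop criterion
            break
        medium = [mutate(b) for b in medium]  # mutation only, no crossover
    return memory

# Toy run: the fitness counts how many of the targets {0, 1, 2} are covered.
targets = {0, 1, 2}
fit = lambda s: len(set(s) & targets) / len(targets)
print(bacteriologic_loop(fit, lambda b: (b + 1) % 8, [0, 5, 7]))  # [0, 1, 2]
```

Unlike the genetic sketch, the memory is never mutated, so a good test case can no longer be lost between rounds.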

23

Bacteriologic algorithm

Tools development (Java, C#)

Experiments:
– Several case studies
– Tuning / validating the model

24

C# case study

[Plot: mutation score (%), from 50 to 100, against the number of generations, from 0 to 30.]

25

Results

An original algorithm for automatic test case generation

Tools development

Work in progress with new programs and new fitness functions

26

Design by contract for robustness and diagnosability

27

Design by Contract™

A design method for OO components (B. Meyer)

Boolean assertions:
– pre- and postconditions for each method
– invariants for global properties

A broken contract indicates the presence of a bug:
– precondition violated: a client has broken the contract
– postcondition violated: an error in the method itself
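A minimal sketch of such executable contracts, transposed to Python assertions (the thesis examples are in Eiffel; the P_TIME-like class and its bounds here are illustrative):

```python
# Illustrative Design by Contract style in Python: a broken assertion
# localises the bug to either the client or the method.
class Time:
    def set_hour(self, h):
        # Precondition: a violation means the client broke the contract.
        assert 0 <= h <= 23, "precondition violated: client broke the contract"
        self._hour = h
        # Postcondition: a violation means an error in the method itself.
        assert self._hour == h, "postcondition violated: error in the method"

    def invariant(self):
        # Global property; full DbC would check it after each public method.
        return 0 <= self._hour <= 23

t = Time()
t.set_hour(13)
print(t.invariant())  # True
```

Calling t.set_hour(25) raises the precondition assertion, which is exactly the contract-as-oracle behaviour used in the mutation experiments above.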

28

Two quality criteria

Robustness:
– the ability of a component to detect a faulty internal state

Diagnosability:
– the effort needed to localise a fault, and the precision allowed by a test strategy on a given system, knowing there is a fault

29

Robustness

Local robustness: the ability of contracts to detect errors
– combination is better than addition

Global robustness

[Diagram: components A, B and C, each with contracts; Det(A,C) is the probability that the contracts in A detect errors in C.]

30

Robustness

The global robustness of a component assembly depends on:
– the local robustness of the components
– the probability Det(i,j) that component i detects errors in component j

Both are estimated by mutation analysis, with a test case set reaching a 100% mutation score:
– local robustness: the mutation score of the component's contracts
– Det(i,j): the mutation score of the contracts in i against the mutants of component j
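For illustration only, here is one simple way such quantities could be combined, assuming the detection events are independent (the thesis model may combine them differently; the numbers and function name are made up): an error in component j escapes only if j's own contracts and every observing component miss it.

```python
# Hypothetical combination of local robustness and Det(i, j) probabilities
# under an independence assumption; not the thesis formula.
def global_robustness(local, det, j):
    """local[j]: local robustness of j; det[(i, j)]: P(i detects errors in j)."""
    miss = 1.0 - local[j]                 # j's own contracts miss the error
    for (i, k), p in det.items():
        if k == j and i != j:
            miss *= 1.0 - p               # observer i also misses it
    return 1.0 - miss

local = {"A": 0.6, "B": 0.5, "C": 0.7}
det = {("A", "C"): 0.5, ("B", "C"): 0.2}
print(round(global_robustness(local, det, "C"), 2))  # 0.88
```

This matches the qualitative claim of the plot on the next slide: even components with modest isolated robustness can reach a much higher global robustness inside an assembly.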

31

Robustness

[Plot: global robustness against isolated robustness (both from 0 to 1), for three case studies: SMDS, InterView, Python.]

32

Diagnosability

[Diagram: the diagnosis scope in classical software, bounded by exception handling, compared with the narrower diagnosis scope in software designed by contract.]

33

Diagnosability

[Plots: diagnosability as a function of the contract/assertion density (from 0 to 1) and of the contract/assertion efficiency (from 0.2 to 0.8).]

34

Results

Qualitative study of design by contract. Summary:

– Adding contracts, even weak ones, improves the component's quality

– The efficiency of contracts has more impact than their density

35

Testability anti-patterns in a UML class diagram

36

Testability of OO software

Control is distributed:
– numerous interactions between objects

Ambiguities in the design can lead to hard points for testing

From the class diagram and a test criterion, detect and delete testability anti-patterns
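A sketch of what detecting such an anti-pattern could look like, with the class diagram abstracted as a directed dependency graph (the representation and function names are assumptions for illustration, not the thesis algorithm): the self-usage anti-pattern shows up as a dependency cycle, like the Buddy/BuddyState association in the example that follows.

```python
# Hypothetical detector: find all classes involved in a dependency cycle
# of a class diagram represented as {class: [classes it depends on]}.
def find_cycles(deps):
    """Return the set of classes that lie on some dependency cycle."""
    in_cycle = set()

    def visit(node, path):
        if node in path:
            # Back to a class already on the path: record the cycle.
            in_cycle.update(path[path.index(node):])
            return
        for succ in deps.get(node, []):
            visit(succ, path + [node])

    for cls in deps:
        visit(cls, [])
    return in_cycle

deps = {"Buddy": ["BuddyState"], "BuddyState": ["Buddy"], "Client": ["Buddy"]}
print(sorted(find_cycles(deps)))  # ['Buddy', 'BuddyState']
```

Client depends on Buddy but is not reported, since only the Buddy/BuddyState pair actually forms a cycle.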

37

Example

[Class diagram of an instant-messaging design: a Client holds Buddy objects with a state (Connected / NonConnected); Buddy has a -currentState association to BuddyState, whose subclasses are Connected and NonConnected; BuddyICQ and BuddyAIM specialise Buddy and use ICQDirectProtocol and AIMDirectProtocol through the «interface» DirectProtocol; ICQIndirectProtocol and AIMIndirectProtocol implement the «interface» IndirectProtocol.]


42

Anti-patterns

Two anti-patterns in the design:

– Self-usage

[Diagram: Buddy and BuddyState form a cycle through the -currentState association.]

43

– Class interaction

[Diagram: the interaction subgraph among BuddyState (Connected, NonConnected), Buddy, BuddyICQ, BuddyAIM, ICQDirectProtocol and AIMDirectProtocol through the «interface» DirectProtocol.]

44

Improve testability

Add precision to the design to make it closer to the implementation

Refactoring for testability:
– use interfaces when possible

Use stereotypes:
– specify the roles of associations: consult, create, modify

45

Improve testability

[Class diagram (Abstract Factory): a Client uses the «interface» WidgetFactory; MotifWidgetFactory and PMWidgetFactory implement it; the products are Window (MotifWindow, PMWindow) and ScrollBar (MotifScrollBar, PMScrollBar).]


48

Improve testability

[Same class diagram, with «create» stereotypes added to specify the role of the creation associations.]

49

Methodology for testability

[Flowchart: class diagram → testability analysis → improve the design, refuse the design, or accept and implement the design; check the properties on the implementation, then test.]

50

Results

Testability analysis on the design
A methodology to improve the class diagram
A catalogue for the testability of design patterns

51

Conclusion

Work on the validation and design of software components:
– algorithms for test generation
– models to measure the quality of components

Test tools

Publications (JSS, ISSRE, Metrics, ASE…)

52

Approach

– Mutation analysis
– Evolutionary algorithms
– Design by contract
– Testability

"Testing can prove the presence of bugs, but never their absence" (Dijkstra)

53

Future work

Test generation for efficient diagnosis:
– diagnosis algorithms
– test criteria and the bacteriologic algorithm

Mutation analysis for security
Design by contract as a test oracle
Testability of design patterns:

– the addition of stereotypes is seen as a model transformation
