Intel Academic Forum, Budapest, 19 September 2002
ISPRAS Experience in Model Based Testing
Alexander K. Petrenko,Institute for System Programming of Russian Academy of Sciences (ISPRAS),http://www.ispras.ru
ISPRAS Experience in Industrial Model Based Testing
Why Model Based Testing?
Exhaustive testing that covers all implementation paths is impossible.
Exhaustive implementation-based ("white box") testing does not guarantee correct functionality.
White box testing lengthens development, because test development can start only when the implementation is complete.
Nevertheless, we want to conduct systematic testing. Formal models provide a basis for systematic testing: from the models we derive test coverage metrics, input stimuli, and correctness criteria for the results, and test development can run ahead of the implementation schedule.
Model Checking vs. Model Based Testing
Answers the question:
Model checking: is the model correct?
Model based testing: does the implementation behavior conform to the model behavior?
Expected result:
Model checking: a correct model.
Model based testing: a test suite for implementation testing, and thereby a proper implementation.
Complexity of the models:
Model checking: simpler than the implementation, because of the restrictions of analytic analysis.
Model based testing: close to the complexity of the implementation under test.
Relation between model and implementation:
Model checking: very complicated.
Model based testing: simple.
Synonyms
Model = (formal) specification
We consider behavioral/functional models. The models provide a simplified, abstract view of the target software/hardware.
Processing the models requires their formal description, i.e. a specification.
Model Based Testing Approach
Generate exhaustive test suites for a model of the implementation.
Translate the test suites to the implementation level.
Apply the tests to the implementation under test.
(Optionally) Interpret the testing results in terms of the model.
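These four steps can be sketched on a toy scale. The counter model, implementation, and function names below are illustrative stand-ins, not part of any ISPRAS tool:

```python
class CounterModel:
    """Abstract model: a counter that never goes below zero."""
    def __init__(self):
        self.value = 0
    def inc(self):
        self.value += 1
    def dec(self):
        self.value = max(0, self.value - 1)

class CounterImpl:
    """Implementation under test."""
    def __init__(self):
        self._v = 0
    def inc(self):
        self._v += 1
    def dec(self):
        if self._v > 0:
            self._v -= 1
    def get(self):
        return self._v

def run_model_based_test(ops):
    """ops is a test suite generated from the model (step 1)."""
    model, impl = CounterModel(), CounterImpl()
    for op in ops:
        getattr(model, op)()              # model-level stimulus
        getattr(impl, op)()               # steps 2/3: translate and apply
        if impl.get() != model.value:     # step 4: interpret the result
            return "divergence at " + op  #         in terms of the model
    return "conform"

print(run_model_based_test(["inc", "inc", "dec", "dec", "dec"]))  # conform
```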
Related Work
IBM Research Laboratory (Haifa, Israel)
Microsoft Research (Redmond, US)
Examples of Model Based Testing Applications
IBM Research Laboratory (Haifa, Israel): Store Data Unit (a digital signal processor); APIs of file systems, telephony and Internet protocols, etc.
Microsoft Research (Redmond, US): Universal PnP interface.
ISPRAS (Moscow, Russia): kernel of an operating system (Nortel Networks); IPv6 protocol (Microsoft); compiler optimization units (Intel); massive parallel compiler testing (RFBR, Russia).
Origin of ISPRAS Methods
1987-1994: test suite for the compiler of a real-time programming language for the "Buran" space shuttle.
1994-1996: ISPRAS and Nortel Networks contract on functional test suite development for a switch operating system kernel. A few hundred bugs were found in the OS kernel, which had been in use for 10 years. The result became the KVEST technology; about 600K lines of Nortel code had been tested with it by 2000.
ISPRAS Model Based Testing: Two Approaches
UniTesK: testing of Application Program Interfaces (APIs) based on software contracts.
Lama: compiler testing based on LAnguage Model Application.
UniTesK
Testing of Application Program Interfaces (API)
What is API?
(Diagram: the user interface vs. the Application Program Interface (API) of a system.)
Functional Testing
The UniTesK method deals with functional testing: requirements are captured as formal specifications, and tests are derived from the specifications.
To automate testing, we provide a formal representation of the requirements.
UniTesK Process: Phases and Techniques
Interface specification: pre- and post-conditions, invariants.
Test scenario description: implicit Finite State Machines (FSMs), data iterators.
Test execution and test result analysis: test coverage metrics based on the specification structure.
Decomposition of Testing Tasks
The entire test is a test sequence intended to achieve the specified coverage.
From the specification we can generate test oracles and define test coverage metrics.
(Diagram: test sequence construction feeds the test oracles, which exercise the system under test.)
Test Suite Architecture
Legend: components are pre-built, manual, or generated; generation is automatic derivation.
Manual: specification, test scenario.
Generated: test oracle, test coverage tracker and data model (from the specification); scenario driver (from the test scenario).
Pre-built: test engine.
A mediator, written in Java/C/C++/C#, connects the generated components to the system under test.
Test Oracle
Specification of method f:

integer f(a : float)
post {
    post_f(a, f_result)
}

Test oracle for the method f:

f_result = f(x);
if (post_f(x, f_result)) verdict = true; else verdict = false;
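A minimal sketch of such a generated oracle, assuming a hypothetical method integer_sqrt as f and a floor-square-root postcondition as post_f (both names are illustrative):

```python
import math

def post_f(a, f_result):
    """Postcondition: f_result is the integer floor of the square root of a."""
    return f_result * f_result <= a < (f_result + 1) * (f_result + 1)

def integer_sqrt(a):
    """Implementation under test: the method f of the slide."""
    return math.isqrt(a)

def oracle(x):
    f_result = integer_sqrt(x)        # f_result = f(x)
    verdict = post_f(x, f_result)     # check post_f(x, f_result)
    return verdict                    # True = pass, False = fail

print(all(oracle(x) for x in range(1000)))  # True
```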
Test Coverage Metrics Based on Specification Structure
Specification:

post {
    if ( a || b || c || d && e )
        { branch "OK"; ... }
    else
        { branch "Bad parameters"; ... }
}

Partition (derivation of branches and logical terms):

BRANCH "OK"
    a              -- op1
    !a && b        -- op2
    !a && !b && c  -- op3
    ...
BRANCH "Bad parameters"
    !a && !b && !c && !d
    !a && !b && !c && d && !e
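The partition can be computed mechanically. A sketch for the condition a || b || c || d && e (the class labels are illustrative):

```python
from itertools import product

def coverage_class(a, b, c, d, e):
    """Return the first logical term that decides a || b || c || d && e."""
    if a:        return "OK: a"                          # op1
    if b:        return "OK: !a && b"                    # op2
    if c:        return "OK: !a && !b && c"              # op3
    if d and e:  return "OK: !a && !b && !c && d && e"   # op4
    if not d:    return "Bad: !a && !b && !c && !d"
    return "Bad: !a && !b && !c && d && !e"

# Every concrete input falls into exactly one equivalence class.
classes = {coverage_class(*v) for v in product([False, True], repeat=5)}
print(len(classes))  # 6 coverage classes instead of 2**5 = 32 raw inputs
```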
Test Sequence Generation
We use an FSM to generate test sequences that traverse all equivalence classes defined by the partition analysis.
(Diagram: an FSM with states S1-S4 whose transitions are labeled with operations op1-op3.)
But writing a full FSM description by hand is labor-consuming, tedious work.
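Given an explicit FSM, a sequence covering every transition can be computed mechanically. The machine below is an illustrative stand-in for the one on the slide:

```python
from collections import deque

fsm = {                                  # state -> {operation -> next state}
    "S1": {"op1": "S2", "op2": "S3"},
    "S2": {"op3": "S3"},
    "S3": {"op2": "S4", "op3": "S1"},
    "S4": {"op3": "S1"},
}

def shortest_path(src, dst):
    """BFS over the FSM; returns the (state, op) steps leading from src to dst."""
    queue, seen = deque([(src, [])]), {src}
    while queue:
        state, path = queue.popleft()
        if state == dst:
            return path
        for op, nxt in fsm[state].items():
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, path + [(state, op)]))
    raise ValueError("unreachable state")

def test_sequence(start="S1"):
    """Greedily build an op sequence that fires every transition at least once."""
    todo = [(s, op) for s in fsm for op in fsm[s]]
    state, seq = start, []
    while todo:
        src, op = todo[0]
        for st, o in shortest_path(state, src):   # walk to the next uncovered
            seq.append(o)                         # transition's source state
            todo = [t for t in todo if t != (st, o)]
        seq.append(op)                            # then fire it
        todo = [t for t in todo if t != (src, op)]
        state = fsm[src][op]
    return seq

seq = test_sequence()
# Replaying seq from S1 traverses every transition of the FSM.
```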
FSM Construction: Statics
(Diagram: the FSM factorized into equivalence classes of states SC1-SC4, with transitions labeled op1-op3.)
Partition (branches and logical terms):

BRANCH "OK"
    a              -- op1
    !a && b        -- op2
    !a && !b && c  -- op3
    ...
BRANCH "Bad parameters"
    !a && !b && !c && !d       -- opi
    !a && !b && !c && d && !e  -- opi+1

First step of FSM construction:
- state and transition partition based on the pre- and post-condition structure (FSM factorization)
- test input iterators
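The factorization idea can be illustrated with a hypothetical bounded buffer: concrete states are grouped by which specification guards they satisfy, so the test engine explores a handful of class-states instead of the full concrete state space.

```python
def state_class(size, capacity):
    """Classify a bounded-buffer state by the guards it satisfies."""
    if size == 0:
        return "empty"      # only 'put' is enabled
    if size == capacity:
        return "full"       # only 'get' is enabled
    return "partial"        # both operations are enabled

CAPACITY = 100
concrete_states = range(CAPACITY + 1)     # 101 concrete states
classes = {state_class(s, CAPACITY) for s in concrete_states}
print(sorted(classes))  # ['empty', 'full', 'partial']
```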
FSM Construction: Dynamics
Second step of FSM construction.
(Diagram: the equivalence-class FSM SC1-SC4 as designed, alongside the transitions actually traversed as the result of test execution.)
Model Based Testing: Problems of Deployment
Recall the 1994-1996 ISPRAS and Nortel Networks contract: a few hundred bugs found in an OS kernel that had been in use for 10 years, and about 600K lines of Nortel code tested with KVEST by 2000.
Yet KVEST was deployed only in Nortel's regression testing process. Why?
And only a few formal techniques are used in real-life practice at all. Why?
Problems of Model Based Testing Deployment
Problem: formal models suitable for analytic verification are too simple for test generation.
UniTesK solution: richer specifications (the ratio of model to implementation size is about 1:5-10), with mediators providing a bridge between the abstract models and the implementation.

Problem: executable models cannot provide test oracles in the general case, because of dependence on the implementation and because of nondeterminism.
UniTesK solution: implicit specifications (pre- and post-conditions) provide the test oracles.

Problem: test sequence generation needs very large models (for example, FSMs).
UniTesK solution: implicit FSMs; the usual number of states is about 5-20.

Problem: how can test quality be estimated without implementation test coverage?
UniTesK solution: the structure of pre- and post-conditions is informative yet simple enough to serve as a basis for test coverage metrics.

Problem: there is a gap between formal techniques and software/hardware development practice.
UniTesK solution: usual programming languages are extended for specification purposes.
UniTesK Tools and Applications
CTesK (C testing tool, alpha version): Microsoft IPv6 implementation.
J@T (Java testing tool, beta version): partially tested by itself; API of the parallel debugger of the mpC IDE (mpC is a parallel extension of C); POSIX/Win32 file I/O subsystem.
VDM++TesK: free.
Further steps: C#TesK and C++TesK, and conceivably VHDLTesK.
Lama
Compiler testing based on LAnguage Model Application
Pilot project under contract with Intel
Model Based Testing of Compiler Optimization Units
Automate optimization unit test generation: improve test coverage of the units and automate the test oracle problem.
Lama approach
Lama stands for compiler testing based on LAnguage Model Application.
Lama process steps, given a programming language (PL):
Invent a simplified model language (ML) for the PL.
Generate a set of test "programs" in the ML.
Map the ML test "programs" into PL test programs.
Run the compiler (or a compiler unit) on each PL test program and analyze the correctness of the compiler's results.
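The generation and mapping steps can be sketched on a toy scale. The building-block names below are illustrative; in Lama the real building blocks come from the PL specification and the optimization background:

```python
from itertools import product

ML_OPS = ["assign", "branch", "loop"]     # illustrative ML building blocks

def generate_ml_programs(length):
    """Step 2 (sketch): exhaustively enumerate ML 'programs' of a given length."""
    return list(product(ML_OPS, repeat=length))

def map_to_pl(ml_program):
    """Step 3 (sketch): render an ML 'program' as a C-like test program."""
    body = "\n".join("    {}();".format(op) for op in ml_program)
    return "void test(void) {\n" + body + "\n}"

programs = generate_ml_programs(2)
print(len(programs))  # 3**2 = 9 exhaustive ML test programs of length 2
```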
Process of Optimization Unit Testing
Inputs: the programming language (PL) specification, the optimization background, and the model-language building blocks.
Step 1: model language design.
Step 2: iterator development.
Step 3: mapper development.
Step 4: test execution and test result analysis.
The iterators produce test "programs" in the ML, the mapper translates them into test programs in the PL, and execution yields fault and test coverage reports.
An Example: Common Subexpression Elimination Optimization
Step 1 (model language design): the ML building blocks are labels, instructions, transitions to labels, basic blocks, IF instructions (if condition then block else block), and common subexpressions.
Result of Translation into C
Step 3:

if ( ('c' - 'a') + (('c' - 'a') > ('c' - 'a')) ) {
    (('c' - 'a') + (('c' - 'a') < ('c' - 'a')));
} else {
    (('c' - 'a') + (('c' - 'a') >= ('c' - 'a')));
}
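The point of such a generated program is that it computes the same values with and without elimination of the repeated subexpression 'c' - 'a', so an optimized run can be checked against a reference run. A Python model of the two versions, standing in for the compiled C:

```python
def unoptimized():
    """Evaluate the generated expression with every subexpression recomputed."""
    cond = (ord('c') - ord('a')) + ((ord('c') - ord('a')) > (ord('c') - ord('a')))
    if cond:
        return (ord('c') - ord('a')) + ((ord('c') - ord('a')) < (ord('c') - ord('a')))
    return (ord('c') - ord('a')) + ((ord('c') - ord('a')) >= (ord('c') - ord('a')))

def optimized():
    """Same expression after common subexpression elimination."""
    t = ord('c') - ord('a')       # the common subexpression, computed once
    if t + (t > t):
        return t + (t < t)
    return t + (t >= t)

print(unoptimized() == optimized())  # True
```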
Conclusion
Conclusion on UniTesK and Lama
Both UniTesK and Lama follow the model based testing approach.
Basic idea: test complex software by exhaustively covering relatively simple models.
Area of applicability: any software or hardware component with well-defined interfaces or functional properties.