
    PARAMETERIZED VALIDATION OF UML-LIKE MODELS FOR REACTIVE EMBEDDED SYSTEMS

    ANKIT GOEL

    (B.Tech. (Hons.), IT-BHU, India)

    A THESIS SUBMITTED FOR THE DEGREE OF DOCTOR OF PHILOSOPHY

    DEPARTMENT OF COMPUTER SCIENCE

    NATIONAL UNIVERSITY OF SINGAPORE

    2009


    Contents

    I  Introduction

    1  Introduction
       1.1  The Problem Addressed in this Work
       1.2  Solution Proposed in this Dissertation
       1.3  Contributions of this Thesis
       1.4  Organization of the Thesis

    II  Modeling Notations

    2  Related Work
       2.1  State-based Models
       2.2  Scenario-based Models
            2.2.1  Analysis of MSC Specifications
            2.2.2  Realizability and Implied Scenarios
            2.2.3  Scalability of MSC Specifications
            2.2.4  Other Notations
       2.3  Parameterized System Verification
       2.4  Model Checking and Data Abstraction
       2.5  The Semantics of a Class

    3  Interacting Process Classes (IPC)
       3.1  The Modeling Language
       3.2  Modeling a Rail-Car System: The First Cut
       3.3  Concrete Execution Semantics
       3.4  Abstract Execution Semantics
            3.4.1  Abstract Execution of the Core Model
            3.4.2  Dynamic Process Creation/Deletion
       3.5  Associations
            3.5.1  Modeling Static and Dynamic Associations
            3.5.2  Concrete Execution of IPC Models with Associations
            3.5.3  Abstract Execution of IPC Models with Associations
       3.6  Exactness of Abstract Semantics
            3.6.1  Over-Approximation Results
            3.6.2  Spurious Abstract Executions
            3.6.3  Detecting Spurious Abstract Executions
       3.7  Experiments
            3.7.1  Modeled Examples
            3.7.2  Use Cases
            3.7.3  Timing and Memory Overheads
            3.7.4  Checking for Spurious Execution Runs
            3.7.5  Debugging Experience
       3.8  Discussion

    4  Symbolic Message Sequence Charts (SMSC)
       4.1  Syntax
            4.1.1  Visual Syntax
            4.1.2  Abstract Syntax
       4.2  CTAS Case Study
       4.3  Process Theory
            4.3.1  Configurations and Concrete Semantics
            4.3.2  Semantic Rules and Bisimulation
       4.4  Abstract Execution Semantics
            4.4.1  Translating SMSCs to Process Terms
            4.4.2  Representing/Updating Configurations
            4.4.3  Example
            4.4.4  Properties of SMSC Semantics
       4.5  Experiments
       4.6  Associations
            4.6.1  Case Study: A Course Management System
            4.6.2  Association Constraints
       4.7  Abstract Execution Semantics with Associations
            4.7.1  Association Insert
            4.7.2  Association Check/Delete
            4.7.3  Default Case
       4.8  Discussion

    5  IPC vs. SMSC
       5.1  Local vs. Global Control
       5.2  Granularity of Execution
       5.3  Lifeline Abstraction
       5.4  Which Is More Expressive?

    III  Model-based Test Generation

    6  Testing: Related Work
       6.1  State-based
       6.2  Scenario-based
       6.3  Combined Notations
       6.4  Symbolic Test Generation

    7  Test Generation from IPC
       7.1  Case Study: MOST
       7.2  Meeting Test Specifications
            7.2.1  Problem Formulation
            7.2.2  A* Search
            7.2.3  Test Generation Algorithm
       7.3  Experimental Results

    8  Test Generation from SMSC
       8.1  Test-purpose Specification
            8.1.1  CTAS Case Study
            8.1.2  Test-purpose Specification
       8.2  Test Generation Overview
            8.2.1  Deriving the Abstract Test-case SMSC
            8.2.2  Deriving Templates
            8.2.3  Deriving Concrete Tests
       8.3  Test Generation Method
            8.3.1  Abstract Test-case Generation
            8.3.2  Template Generation
            8.3.3  Concrete Test-case Generation
            8.3.4  Summary
       8.4  Test-execution Setup
       8.5  Experiments
            8.5.1  Test Generation
            8.5.2  Portability
            8.5.3  Test Execution
       8.6  Discussion

    IV  Conclusion

    9  Conclusions and Future Work
       9.1  Future Work
            9.1.1  Extensions
            9.1.2  Applications

    A  IPC
       A.1  Proof of Theorem 1
       A.2  Checking Spuriousness of Execution Runs in Murphi

    B  IPC Test Generation Algorithm genTrace


    Abstract

    Distributed reactive systems consisting of classes of behaviorally similar interacting processes arise in various application domains, such as telecommunication, avionics, and automotive control. Examples include a telecommunication network with thousands of interacting phones (constituting a phone class), or a controller in an air-traffic control system managing hundreds of clients that require the latest weather information. Various existing modeling notations, such as those included in the UML standard (e.g., State machines and Sequence diagrams), are not well equipped for requirements modeling of such systems, since they assume a fixed number of processes in the system.

    Message Sequence Charts (MSCs) and their variants, such as UML Sequence diagrams, are po
