A Testing Framework for Reproducible Execution and Race Condition Detection in Real-time Embedded Systems

Ken Chen, JSC
Eric Wong, UT at Dallas
Yann-Hang Lee, ASU


Motivation

Real-time embedded systems are widely deployed in NASA missions, including manned and unmanned space vehicles.

They often exhibit temporally dependent, non-deterministic behavior and are thus extremely difficult to test:
- Threads may interact in an unpredictable manner due to scheduling and synchronization
- Interaction with the physical environment can be unpredictable, e.g., interrupts, timers, and changes of sensor values

How do we verify the temporal behavior of real-time embedded systems in the presence of non-determinism? Or, put concretely: will the software behave similarly when the interval between the arrivals of two interrupt events is 1 sec, 2 sec, 3 sec, ...?


Challenges

How to characterize a non-deterministic execution caused by temporal dependency?

How to control an otherwise non-deterministic execution so that it can be reproduced (for debugging and test analysis)?

How to derive the possible deviations of a non-deterministic execution?

A Testing Framework

A platform-independent approach to deterministic execution:
- Can trace/replay an execution or force a specified test sequence to be exercised
- Follows the same synchronization and I/O event sequences
- Time-stamps events to recover timing information
- Works at a higher level of abstraction

A systematic approach to deriving the possible deviations of a non-deterministic execution:
- Based on static/dynamic code analysis
- Does not require any formal specification of the system's behavior

The Testing Process

[Diagram: conduct a test run → synchronization and I/O event trace → dynamic and race analyses → race variants → reproducible execution]

Overview of Tool Environment

[Diagram: tool environment. Instrumentation: run test cases in the target environment, supported by static analysis (control flow and data dependence) and dynamic analysis (execution flow, timing, synchronization, and I/O operations). Analysis: model deduction from multiple test runs yields a model of events and program execution, which feeds timing and race condition verification and the creation of new event occurrences from uncovered intervals.]

Record/Replay Framework

- An execution trace is needed for race condition analysis: event type and event sequence, plus timing (a log-record sketch follows this list)
- Record the event sequence between threads and with the environment
- Replay to reproduce the identical sequence or the (relative) timing
  - Enables dynamic analysis such as coverage and slicing, test case generation, and debugging
- Related work:
  - Software Instruction Counter (counts backward branches and subroutine calls)
  - Deterministic Java Replay Utility (KVM, an interpreter)
  - Complete System Simulation (instruction set simulation)
  - Time Machine (register context)
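The slides do not show the trace format itself; the following is a minimal sketch, assuming a fixed-size in-memory log (all names, such as event_record_t and log_event, are hypothetical), of the kind of timestamped record a recorder could append for each synchronization or I/O event:

```c
#include <stdint.h>

/* Hypothetical event kinds covering the operations the framework records. */
typedef enum {
    EV_SEM_TAKE, EV_SEM_GIVE, EV_MSGQ_SEND, EV_MSGQ_RECEIVE,
    EV_SIGNAL, EV_READ, EV_WRITE, EV_INTERRUPT
} event_kind_t;

/* One timestamped entry in the execution trace. */
typedef struct {
    uint32_t     seq;        /* global event sequence number           */
    uint32_t     timestamp;  /* e.g., system clock tick count          */
    int          task_id;    /* task that performed the event          */
    event_kind_t kind;       /* which synchronization/I/O operation    */
    uintptr_t    object;     /* semaphore/queue/fd the event refers to */
    int32_t      result;     /* return value, replayed from the buffer */
} event_record_t;

/* Fixed-size in-memory log, flushed to the host for offline analysis. */
#define LOG_CAPACITY 4096
static event_record_t event_log[LOG_CAPACITY];
static uint32_t       event_count;

/* Append one event; returns 0 on success, -1 if the log is full. */
int log_event(const event_record_t *ev)
{
    if (event_count >= LOG_CAPACITY)
        return -1;
    event_log[event_count] = *ev;
    event_log[event_count].seq = event_count;
    event_count++;
    return 0;
}
```

Time-stamping each record is what lets replay reproduce not only the order of events but also their (relative) timing.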

System Architectures

Relative versus exact replay execution:

[Diagram: two VxWorks architecture stacks, each with App 1-3 and a system task running over a System Call Recorder, VxWorks, and the board support packages / IO driver. Exact execution (interrupt replay) uses the recorder alone; relative execution (OS-level replay) adds a System Call Generator beside VxWorks.]

OS-Level Record/Replay

- Record event start and end marks
- During replay, execution defers until the next event in the log (sketched below)
- All results are returned from the recorded buffer during replay
- Cooperative execution based on event order

[Diagram: recording logs events 1-8 in order; replaying reproduces the same event order from the log.]
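As a rough illustration of the deferral rule above (hypothetical names, reusing the event_record_t type from the sketch on the Record/Replay Framework slide): each instrumented operation blocks until the global replay cursor reaches its recorded sequence number, then returns the logged result instead of re-executing.

```c
#include <stdint.h>

/* Hypothetical replay cursor: index of the next event allowed to run. */
static volatile uint32_t replay_cursor;

/* Block the calling task until it is this event's turn in the log.
 * A real VxWorks port would pend on a semaphore or call taskDelay()
 * instead of spinning. */
static void wait_for_turn(uint32_t seq)
{
    while (replay_cursor != seq)
        ;  /* spin until the preceding events have completed */
}

/* Replay one recorded event: defer until its turn in the recorded
 * order, then return the logged result from the buffer instead of
 * re-executing the operation. */
int32_t replay_event(const event_record_t *ev)
{
    wait_for_turn(ev->seq);
    replay_cursor++;      /* release the next event in the order */
    return ev->result;    /* result comes from the record buffer */
}
```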

Framework

[Diagram: the target system (an IXP1200 board) runs App 1-3, a system task, and the record/replay task on VxWorks over the board support packages and IO driver. The target connects to a workstation over a serial link and to a server through an Ethernet router.]

Design Considerations for AERCam

- Scheduling
  - Preemptive priority scheduling
  - Task states: Ready, Pending, Delay, Suspend
  - Execution context: priorities 0 to 255, with system tasks using 0 to 100
  - Task name
- IPC
  - Application versus interrupt context
  - Timeout
- Signals
  - Synchronous versus asynchronous generation and delivery
- The current implementation supports semTake, semGive, msgQSend, msgQReceive, signal, kill, read, and write (a wrapper sketch follows)
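The slides list which calls are intercepted but not how; one plausible shape (a sketch only: rr_semTake, rr_mode, and find_next_record are hypothetical, while semTake, taskIdSelf, and tickGet are standard VxWorks calls, and event_record_t, log_event, and replay_event come from the earlier sketches) is a wrapper that records in one mode and replays from the buffer in the other:

```c
#include <vxWorks.h>
#include <semLib.h>
#include <taskLib.h>
#include <tickLib.h>

typedef enum { RR_OFF, RR_RECORD, RR_REPLAY } rr_mode_t;
static rr_mode_t rr_mode = RR_OFF;   /* hypothetical mode switch */

/* Hypothetical lookup of this task's next recorded event of a kind. */
extern void find_next_record(int task_id, event_kind_t kind,
                             event_record_t *out);

/* Wrapper that application code reaches instead of semTake(),
 * e.g., substituted at link time or by instrumentation. */
STATUS rr_semTake(SEM_ID semId, int timeout)
{
    event_record_t ev;
    STATUS result;

    if (rr_mode == RR_REPLAY) {
        /* Defer until this event's recorded turn, then return the
         * logged result; the semaphore is not actually taken again. */
        find_next_record(taskIdSelf(), EV_SEM_TAKE, &ev);
        return (STATUS)replay_event(&ev);
    }

    result = semTake(semId, timeout);   /* execute the real call */
    if (rr_mode == RR_RECORD) {
        ev.timestamp = tickGet();       /* recoup timing information */
        ev.task_id   = taskIdSelf();
        ev.kind      = EV_SEM_TAKE;
        ev.object    = (uintptr_t)semId;
        ev.result    = (int32_t)result;
        log_event(&ev);
    }
    return result;
}
```

The other supported calls (semGive, msgQSend, msgQReceive, signal, kill, read, write) would follow the same record-or-replay pattern.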

Interrupt Record/Replay

[Diagram: during interrupt recording, the vector table routes through an interrupt recorder to the VxWorks _intEnt entry and the device ISR; IPC recorders and the scheduler log IPC events (IPC 1, IPC 2) and interrupts (e.g., IRQ 2) to reserved memory. During interrupt replaying, breakpoints placed in the program code trap to the VxWorks exception handler, and a breakpoint task manager injects the interrupt and IPC events from the event log.]

- Record interrupts, system calls, and context switches
- Replay the exact execution sequence (a sketch follows)
- Re-execute all system calls
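A very rough, platform-neutral sketch of the breakpoint-driven replay idea (every name below is hypothetical; a real port would hook the VxWorks exception handler as in the diagram): arm a breakpoint at the code address recorded for the next interrupt, and when it fires, deliver the logged ISR before resuming.

```c
#include <stdint.h>

/* Hypothetical record of where an interrupt hit during recording. */
typedef struct {
    uintptr_t pc;    /* address of the interrupted instruction */
    int       irq;   /* which interrupt fired                  */
} intr_record_t;

/* Hypothetical hooks a real port would supply (the diagram's
 * breakpoint task manager and exception handler). */
extern void set_breakpoint(uintptr_t pc);
extern void clear_breakpoint(uintptr_t pc);
extern void invoke_isr(int irq);

static const intr_record_t *intr_log;   /* loaded from reserved memory */
static int intr_count, intr_next;

/* Arm a breakpoint at the location of the next recorded interrupt. */
void arm_next_interrupt(void)
{
    if (intr_next < intr_count)
        set_breakpoint(intr_log[intr_next].pc);
}

/* Called from the breakpoint (exception) handler during replay:
 * re-deliver the recorded interrupt at exactly the recorded point. */
void on_breakpoint(uintptr_t pc)
{
    clear_breakpoint(pc);
    invoke_isr(intr_log[intr_next].irq);
    intr_next++;
    arm_next_interrupt();   /* chain to the next recorded interrupt */
}
```

A pc alone is ambiguous when the same address is reached repeatedly (e.g., in a loop), so a real implementation would pair it with an instruction or branch count, as in the Software Instruction Counter work cited earlier.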

RESTA

A tool suite for Real-time Embedded Software Testing and Analysis

- Challenges
  - A massive amount of execution trace is collected that is not only complex but also difficult to interpret
  - An urgent need for a methodology, supported by a tool, to automatically analyze the data and present it from different views for better understanding
- Twofold objectives
  - Visually re-create the program execution to gain insight into its dynamic behavior
  - Present data retrieved from the execution trace from different perspectives to aid in quality assurance, performance improvement, etc.

Our Approach

- Take advantage of data visualization
  - Experience has shown that graphical visualization can help us understand complex phenomena and large amounts of complex data significantly better
  - Through different visualizations of the execution trace, various graphs are generated to help us deduce what really happened during the program execution
- Features in RESTA
  - Message Graph
  - Race Condition Graph
  - Semaphore Graph
  - Task Active Graph
  - Program Execution Graph (Coverage Summary and Source Code Display graphs)

Design and Implementation Philosophy (1)

- Portability
  - RESTA is implemented in Java, so it can run on many different platforms such as Windows, UNIX, and Linux.
- Scalability
  - Many existing tools are not very scalable; their limitations are readily apparent when large, complex data are analyzed.
  - Data collected in our studies can be large and complex.
- Good Visualization
  - All the visual displays should be intuitively meaningful.
- Ease of Understanding
  - Information presented by each graph should be self-explanatory and obvious to its users.
  - Even if a user has to spend a little effort to understand a graph the first time, the same user should easily recall what he or she learned when seeing the graph again.

Design and Implementation Philosophy (2)

- Ease of Use
  - The use of a tool should reduce, not increase, either the stress or the boredom of its users.
  - Provide interactive, mouse-click- or menu-oriented interfaces for invoking different features, with customization.
- Diversity
  - No single graph can offer a full view of the behavior and the data associated with the program execution.
  - Provide different graphs to represent views from different perspectives.
- Extensibility
  - New features for different views will continuously be included.
  - RESTA should adopt a flexible architecture/design to make such extension easy and feasible.

Message Graph (1)

Displays message-passing between different tasks.

Assume a program with seven tasks running simultaneously on a single CPU and the following message-passing between them (a sketch of this scenario in code follows the list):
(1) Task 1 sends a message to Task 2
(2) Task 3 sends a message to Task 2
(3) Task 2 receives the message sent by Task 3 at Step (2)
(4) Task 2 receives the message sent by Task 1 at Step (1)
(5) Task 1 sends a message to Task 3
(6) Task 3 receives the message sent by Task 1 at Step (5)
(7) Task 6 sends a message to Task 3
(8) Task 7 sends a message to Task 3
(9) Task 3 receives the message from Task 7
(10) Task 3 receives the message from Task 6
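A minimal sketch of the kind of code behind steps (1)-(4), assuming the standard VxWorks message-queue API (the queue name, message text, and task bodies are illustrative):

```c
#include <vxWorks.h>
#include <msgQLib.h>

#define MSG_LEN 32

/* One queue per receiving task; Task 2's queue is shown. */
static MSG_Q_ID task2Q;

void setup(void)
{
    /* Up to 10 pending messages of MSG_LEN bytes each, FIFO order. */
    task2Q = msgQCreate(10, MSG_LEN, MSG_Q_FIFO);
}

/* Body of Task 1 (Task 3 is analogous): steps (1) and (2). */
void task1Body(void)
{
    char msg[MSG_LEN] = "from task 1";
    /* Which sender's message is queued first depends on scheduling,
     * which is exactly what the message graph makes visible. */
    msgQSend(task2Q, msg, sizeof(msg), WAIT_FOREVER, MSG_PRI_NORMAL);
}

/* Body of Task 2: steps (3) and (4). */
void task2Body(void)
{
    char msg[MSG_LEN];
    msgQReceive(task2Q, msg, sizeof(msg), WAIT_FOREVER);  /* step (3) */
    msgQReceive(task2Q, msg, sizeof(msg), WAIT_FOREVER);  /* step (4) */
}
```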

Message Graph (2)-(4)

[Screenshots: the message graph rendered for the scenario above.]

Race Condition Graph (1)

Displays possible race conditions due to message-passing between different tasks, which arise from the synchronization among different senders and receivers (a detection sketch follows the screenshot).

[Screenshot: two receivers with possible race conditions are highlighted in red.]
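The slides describe the graph but not its detection rule; one plausible rule (an assumption, expressed over the hypothetical event_record_t trace from the earlier sketches) flags a receive as racy when more than one recorded send to the same queue could have supplied its message under a different interleaving:

```c
/* Count the candidate senders for the receive at index recv_idx: any
 * send to the same queue recorded earlier and not yet consumed could
 * have arrived first under a different schedule. */
int candidate_senders(const event_record_t *log, uint32_t recv_idx)
{
    uintptr_t queue = log[recv_idx].object;
    int pending = 0;
    uint32_t i;

    for (i = 0; i < recv_idx; i++) {
        if (log[i].object != queue)
            continue;
        if (log[i].kind == EV_MSGQ_SEND)
            pending++;          /* candidate message enqueued     */
        else if (log[i].kind == EV_MSGQ_RECEIVE)
            pending--;          /* consumed by an earlier receive */
    }
    /* pending > 1 means the receive is racy: which message it gets
     * depends on timing rather than on program logic alone. */
    return pending;
}
```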

Race Condition Graph (2)

Clicking on the red arrow of the first receiver displays its race conditions as follows:
- The receiver is surrounded by a red square.
- The two competing senders are circled in green.
- The corresponding source code of this receiver and the two senders is displayed in red and green, respectively, in another pop-up window.

[Screenshot: race conditions with respect to the first receiver, highlighted in red.]

Race Condition Graph (3)

Clicking on the red arrow of the second receiver displays its race conditions.

[Screenshot: the receiver (Task 3) may receive a message from Task 1, 6, or 7.]

Semaphore Graph

Displays how tasks take and give semaphores. Example with a mutual-exclusion semaphore (sketched below):
- Task 2 takes SEM1 at tα and gives it at tγ
- Task 3 waits for SEM1 from tβ to tγ before it can take the semaphore at tγ
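A minimal sketch of the contention pattern the graph depicts, assuming the standard VxWorks semaphore API (the task bodies and setup function are illustrative):

```c
#include <vxWorks.h>
#include <semLib.h>

static SEM_ID sem1;

void setup(void)
{
    /* Mutual-exclusion semaphore, as in the slide's example. */
    sem1 = semMCreate(SEM_Q_PRIORITY);
}

/* Task 2: takes SEM1 at t-alpha and gives it back at t-gamma. */
void task2Body(void)
{
    semTake(sem1, WAIT_FOREVER);   /* t-alpha */
    /* ... critical section ... */
    semGive(sem1);                 /* t-gamma: unblocks Task 3 */
}

/* Task 3: calls semTake at t-beta while Task 2 still holds SEM1,
 * so it pends from t-beta until t-gamma before taking it. */
void task3Body(void)
{
    semTake(sem1, WAIT_FOREVER);   /* blocks from t-beta to t-gamma */
    /* ... critical section ... */
    semGive(sem1);
}
```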

Task Active Graph

Displays when each task is active.

Program Execution Graph: Coverage Summary

Provides a visualization of how the program is executed by each task.

Coverage Summary Graph:
- Reports code coverage (basic block and decision) for each task
- Other criteria such as all-synchronizable-sender-receiver-pairs, all-concurrency-paths, etc.
- Coverage with respect to the entire program
- Coverage with respect to the modules executed by each task

Program Execution Graph: Source Code Display

- Code in white has already been executed by the task
- Code in color is prioritized in terms of increasing the coverage
- Priorities are computed using a dominator/superblock analysis (a sketch follows)
- Priorities are displayed as numbers in the color spectrum at the top
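The slides name the analysis but not the computation; one common dominator-based heuristic (an assumption, not necessarily RESTA's exact method) scores each uncovered block by how many not-yet-covered blocks would be guaranteed covered by exercising it, i.e., the block plus its uncovered dominators:

```c
#include <stdbool.h>

#define MAX_BLOCKS 256

/* Hypothetical per-task data: idom[b] is block b's immediate
 * dominator (-1 at the entry block); covered[b] is set from the
 * execution trace. */
static int  idom[MAX_BLOCKS];
static bool covered[MAX_BLOCKS];

/* Priority of exercising block b: every dominator of b lies on all
 * paths to b, so covering b also covers them. The gain is b itself
 * plus its not-yet-covered dominators; higher gain earns a hotter
 * color in the source code display. */
int block_priority(int b)
{
    int gain = 0;
    int a;

    for (a = b; a != -1; a = idom[a])
        if (!covered[a])
            gain++;
    return gain;
}
```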

Summary

Our overall objective is to provide reproducible execution and graphical displays/summaries of various analyses to study program behavior and to improve its quality, dependability, safety, performance, etc.

Current work:
- A layered testing environment
- Finalize the "prefix test sequence plus non-deterministic run" method to identify the various execution paths caused by timing variants (in compliance with an input event constraint model)
- Enhance the current version of RESTA by including additional analyses
- Conduct a case study on AERCam (Autonomous Extravehicular Robotic Camera)
  - Work with the NASA JSC quality assurance engineering team to apply our research results to the AERCam real-time simulation/testing environment, thereby realizing autonomous operation capabilities with a high level of assurance
- Publish and present our research results