
Automated testing of NASA Software - part 2


Page 1: Automated testing of NASA Software - part 2

CESE

Automated Testing of Large Multi-language SW Systems using Cloud

Computing

Technical Presentation

Principal Investigator (PI): Dr. Mikael Lindvall, CESE
NASA POC: Markland Benson, White Sands

Team members: Dharma Ganesan, Dr. Chris Ackermann (CESE)

GMSEC, CFS, Space Network, MSL

© 2011 Fraunhofer USA, Inc. Center for Experimental Software Engineering

Page 2: Automated testing of NASA Software - part 2

Problems

• Test cases are developed manually
• Some test execution is automated (e.g., JUnit)
• Test cases miss valid “corner” cases
• Test cases are also programs: not easy for non-technical stakeholders to understand
• Difficult to summarize what was tested

Approach: Lightweight Model-based Test Generation and Execution


Page 3: Automated testing of NASA Software - part 2

Test Generation Workflow

Page 4: Automated testing of NASA Software - part 2

The structure of the approach

ITestMonkey

TestMonkeyImpl ITestDataProvider

TestDataProviderImpl

Model

System Under Test

Interface

List of abstract actions

List of abstract input data provider methods

• The model is agnostic to the test execution technology
• The ITestMonkey interface hides the test execution framework
• TestMonkeyImpl uses the interfaces of the test execution framework
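The roles on this slide can be sketched in plain C (chosen because the deck's OSAL examples are in C); the names TestMonkey, stub_connect, and run_abstract_sequence are illustrative stand-ins, not GMSEC or OSAL identifiers:

```c
#include <stddef.h>

/* Sketch of the layering: generated tests see only a table of abstract
 * actions (the ITestMonkey role); the concrete binding to a test
 * execution framework lives behind it (the TestMonkeyImpl role). */
typedef struct {
    int (*connect)(void);     /* abstract action */
    int (*disconnect)(void);  /* abstract action */
} TestMonkey;

/* Stand-in implementation; a real one would drive the system under test
 * through JUnit/CuTest-style helpers. */
int stub_state = 0;
static int stub_connect(void)    { stub_state = 1; return 0; }
static int stub_disconnect(void) { stub_state = 0; return 0; }

const TestMonkey stub_monkey = { stub_connect, stub_disconnect };

/* A generated test is just a walk over abstract actions, so swapping the
 * execution technology means swapping the one table it is given. */
int run_abstract_sequence(const TestMonkey *m) {
    if (m->connect() != 0) return -1;
    return m->disconnect();
}
```

The same indirection is why one model can feed JUnit, CuTest, Selenium, and the other back ends listed on the next slide.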

Page 5: Automated testing of NASA Software - part 2

Tools Infrastructure

• Modeling: yEd Graphical Editor from yWorks
• Model traversal and test generation: JUMBL (University of Tennessee)
• Test Execution:
 – JUnit (Java)
 – CuTest (C)
 – Selenium (Web)
 – UISpec (Java Swing)
 – Sikuli (image-based testing of legacy systems)
• Glue scripts:
 – Conversion of yEd models to JUMBL models
 – Preparing a test suite from generated test cases
 – Generation of system-specific build files (e.g., makefiles)
 – Helper scripts to clean up generated files

Page 6: Automated testing of NASA Software - part 2

Test Generation @ GMSEC …

• State-of-the-practice: Test cases are hand-crafted

• New initiative started to evaluate the feasibility of the FAST approach

• Modeled a portion of the GMSEC Software Bus based on existing test cases and documentation

• Automatically generated test cases

• Found a few problems (since fixed)

Page 7: Automated testing of NASA Software - part 2

Hand-crafted test case (snippet)

public static void main( String args[] ) {
    Status result = new Status();
    Connection conn = new Connection();
    ConnectionConfig cfg = new ConnectionConfig( args );

    // Create the connection
    result = ConnectionFactory.Create( cfg, conn );
    checkError( false, result, "Creating the connection object" );

    // Disconnect
    result = conn.Disconnect();
    checkError( true, result, "Disconnecting before connection is established" );

    // Connect
    result = conn.Connect();
    checkError( false, result, "Establishing the connection to the middleware" );
} //..main()

Page 8: Automated testing of NASA Software - part 2

Manually developed test cases – source of inspiration

• We reviewed existing Java test cases

• Found that the tester used certain permutations of API usage

• Both good and “evil” cases were considered

• We used these test cases as a reference for building API usage models

Page 9: Automated testing of NASA Software - part 2

Test Generation @ GMSEC …

public interface IConnection {
    public Status Connect();
    public Status Disconnect();
    …
}

APIs of the module under test

Page 10: Automated testing of NASA Software - part 2

Structure of cFE/CFS

Page 11: Automated testing of NASA Software - part 2

Structure of OSAL

Page 12: Automated testing of NASA Software - part 2

Sample APIs

/******************************************************************************
** Directory API
******************************************************************************/

// Makes a new directory
int32 OS_mkdir (const char *path, uint32 access);

// Opens a directory for searching
os_dirp_t OS_opendir (const char *path);

// Closes an open directory
int32 OS_closedir (os_dirp_t directory);

// Removes an empty directory from the file system.
int32 OS_rmdir (const char *path);
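On a POSIX desktop these four calls map closely onto mkdir(2), opendir(3), closedir(3), and rmdir(2). A minimal sketch of the create → open → close → remove sequence the models walk, using those POSIX calls as stand-ins (directory_roundtrip and its return codes are illustrative, not OSAL):

```c
#include <sys/stat.h>
#include <sys/types.h>
#include <dirent.h>
#include <unistd.h>

/* Walks the happy path over the directory API, with POSIX calls standing
 * in for OS_mkdir/OS_opendir/OS_closedir/OS_rmdir. Returns 0 on success,
 * a negative step code at the first failing step. */
int directory_roundtrip(const char *path) {
    if (mkdir(path, 0755) != 0) return -1;     /* OS_mkdir   */
    DIR *d = opendir(path);                    /* OS_opendir */
    if (d == NULL) { rmdir(path); return -2; }
    if (closedir(d) != 0) return -3;           /* OS_closedir */
    return rmdir(path) == 0 ? 0 : -4;          /* OS_rmdir: must be empty */
}
```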

Page 13: Automated testing of NASA Software - part 2

Example of an OSAL model

Page 14: Automated testing of NASA Software - part 2

API doc of OS_mkdir

/*--------------------------------------------------------------------------
   Name: OS_mkdir

   Purpose: Makes a directory specified by path.

   Returns: OS_FS_ERR_INVALID_POINTER if path is NULL
            OS_FS_ERR_PATH_TOO_LONG if the path is too long to be stored locally
            OS_FS_ERR_PATH_INVALID if path cannot be parsed
            OS_FS_ERROR if the OS call fails
            OS_FS_SUCCESS if success

   Note: The access parameter is currently unused.
---------------------------------------------------------------------------*/
int32 OS_mkdir (const char *path, uint32 access);
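These documented return codes are exactly what the model's invalid transitions probe. A hedged sketch of the checking order they imply: the numeric constants, the OS_MAX_PATH_LEN limit, and the name sketch_OS_mkdir are assumptions for illustration, not the real OSAL definitions:

```c
#include <string.h>
#include <sys/stat.h>
#include <unistd.h>

/* Illustrative stand-ins for the codes named in the doc comment above;
 * the real OSAL values differ. */
#define OS_FS_SUCCESS               0
#define OS_FS_ERROR               (-1)
#define OS_FS_ERR_INVALID_POINTER (-2)
#define OS_FS_ERR_PATH_TOO_LONG   (-3)
#define OS_MAX_PATH_LEN            64   /* assumed local storage limit */

/* Rejects the documented "evil" inputs before touching the OS, then
 * delegates to POSIX mkdir as a stand-in for the real OS call. */
int sketch_OS_mkdir(const char *path, unsigned int access)
{
    (void)access;  /* per the note above: currently unused */
    if (path == NULL)
        return OS_FS_ERR_INVALID_POINTER;
    if (strlen(path) >= OS_MAX_PATH_LEN)
        return OS_FS_ERR_PATH_TOO_LONG;
    return mkdir(path, 0755) == 0 ? OS_FS_SUCCESS : OS_FS_ERROR;
}
```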

Page 15: Automated testing of NASA Software - part 2

Inside Open Invalid Directory

Page 16: Automated testing of NASA Software - part 2

Sample IMonkey Interface

• int32 removeDirectoryValid(void);
• int32 removeDirectoryPathNull(void);
• int32 removeDirectoryPathTooLong(void);
• int32 removeDirectoryPathUnparsable(void);
• int32 removeDirectoryCurrent(void);
• int32 removeDirectoryNotEmpty(void);
• …
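Hypothetical bodies for two of these methods (the fixed path, the helper, and the return values are illustrative; POSIX rmdir stands in for the OSAL call). Each method pins down one abstract model action, including the data for its “evil” variant, so the generated tests stay readable action sequences:

```c
#include <stddef.h>
#include <unistd.h>
#include <sys/stat.h>

#define SK_SUCCESS           0
#define SK_INVALID_POINTER (-2)

/* Shared helper; a real monkey implementation would call OS_rmdir here. */
static int sketch_remove_dir(const char *path) {
    if (path == NULL) return SK_INVALID_POINTER;
    return rmdir(path) == 0 ? SK_SUCCESS : -1;
}

/* Valid action: the directory exists and is empty. */
int removeDirectoryValid(void)    { return sketch_remove_dir("/tmp/monkey_demo_dir"); }

/* "Evil" action: NULL path, expected to be rejected. */
int removeDirectoryPathNull(void) { return sketch_remove_dir(NULL); }
```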

Page 17: Automated testing of NASA Software - part 2

Sample generated Test in CuTest

void Testosal_Filesystem_min_2(CuTest* tc) {
    status = makeFilesystemValid();
    CuAssertIntEquals_Msg(tc, "Filesystem could not be created", OS_FS_SUCCESS, status);

    status = mountFilesystemValid();
    CuAssertIntEquals_Msg(tc, "Filesystem could not be mounted", OS_FS_SUCCESS, status);

    pointer = openDirectoryValid();
    CuAssertTrue(tc, pointer != NULL);
    …
    status = removeFilesystemValid();
    CuAssertIntEquals_Msg(tc, "Filesystem could not be removed", OS_FS_SUCCESS, status);
}

Page 18: Automated testing of NASA Software - part 2

Issues found using this method

• File descriptors remain after removing a file system:
 – After somewhat long tests we would run out of file descriptors
 – This would happen even with a newly created file system
 – OSAL does not remove file descriptors for files that are open when the file system is removed
• Unable to create and open files
• Some wrong error codes returned
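The descriptor leak described above can be reproduced in isolation with plain POSIX (this is a demonstration of the failure mode, not OSAL code): shrink the per-process descriptor limit, open files without closing them, and observe that further opens fail:

```c
#include <fcntl.h>
#include <unistd.h>
#include <sys/resource.h>

/* Returns 1 if leaking descriptors made a further open() fail, 0 otherwise.
 * The soft RLIMIT_NOFILE is lowered so exhaustion shows up after a handful
 * of opens; everything is cleaned up before returning. */
int demo_fd_leak(void) {
    struct rlimit old, lim;
    getrlimit(RLIMIT_NOFILE, &old);
    lim.rlim_cur = 16;                  /* tiny descriptor table for the demo */
    lim.rlim_max = old.rlim_max;
    setrlimit(RLIMIT_NOFILE, &lim);

    int fds[16], count = 0, fd;
    while (count < 16 && (fd = open("/dev/null", O_RDONLY)) >= 0)
        fds[count++] = fd;              /* deliberately not closed yet */

    int exhausted = (open("/dev/null", O_RDONLY) < 0);  /* table now full */

    while (count > 0) close(fds[--count]);              /* clean up */
    setrlimit(RLIMIT_NOFILE, &old);                     /* restore limit */
    return exhausted;
}
```

In the OSAL case the descriptors were leaked by the library itself, so even a freshly created file system inherited an already-full table.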

Page 19: Automated testing of NASA Software - part 2

Current Results

• An end-to-end approach for test generation
 – Successfully used on GMSEC and CFS, where it detected bugs
• Next steps: apply the approach to the Space Network and MSL projects
