
ResearchColab Testing Document

Team: Reckless 7

Institute of Information Technology

University of Dhaka

25th November, 2016


Contents

Introduction
Test Items
Test Strategy
Testing Types
Required tools for testing
Measures and Metrics
Item Pass/Fail Criteria
Test Deliverables
Environmental Needs
Responsibilities
Schedule
Planning Risks and Contingencies
Test Cases
Features to be tested

Introduction

This is the first release of the testing documentation for 'ResearchColab'. It covers the test items, test strategy, risk issues, environmental needs, and an overall outline of how the application will be tested. This document should be followed so that the anticipated outcomes of the project can be confirmed.

Test Items

Testing will consist of several phases; each phase may or may not include testing of one or more of the following aspects of the application. The aspects are listed below.

1. Accessibility
2. Compatibility
3. Performance
4. Availability
5. Content
6. Reliability
7. Response Time
8. Functionality
9. Scalability
10. Coding Standards
11. Navigation
12. Security
13. Usability
14. Software Requirements Specification
15. Design Specification

Test Strategy

The Test Strategy presents the recommended approach to the testing of the 'ResearchColab' application. The previous section on Test Items described what will be tested; this section describes how it will be tested. The main considerations for the test strategy are the techniques to be used and the criteria for knowing when the testing is complete. In addition to the considerations provided for each test below, testing should only be executed using known, controlled databases, in secured environments.

Testing Types

Function Testing

Function testing of the target-of-test should focus on any requirements for test that can be traced

directly to use cases (or business functions), and business rules. The goals of these tests are to

verify proper data acceptance, processing, and retrieval, and the appropriate implementation of the

business rules. This type of testing is based upon black box techniques, that is, verifying the

application (and its internal processes) by interacting with the application via the GUI and

analyzing the output (results). Identified below is an outline of the testing recommended for each

application.

Test Objective: Ensure proper target-of-test functionality, including navigation, data entry, processing, and retrieval.

Technique: Execute each use case, use case flow, or function, using valid and invalid data, to verify the following:
1. The expected results occur when valid data is used.
2. The appropriate error / warning messages are displayed when invalid data is used.
3. Each business rule is properly applied.

Completion Criteria: All planned tests have been executed. All identified defects have been addressed.

Special Considerations: N/A
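Since Selenium is listed in the required tools section of this document, a black-box function test of this kind can be scripted against the GUI. The sketch below is only illustrative: the base URL, routes, field names, element ids, and expected messages are assumptions, not the actual ResearchColab implementation.

```python
# Minimal sketch of a black-box function test with Selenium.
# URL, field names, element ids, and messages are assumptions for illustration.
from selenium import webdriver
from selenium.webdriver.common.by import By

BASE_URL = "http://localhost:8000"  # assumed local deployment

def run_case(email, password, expected_message):
    """Drive one use-case flow through the GUI and check the visible result."""
    driver = webdriver.Chrome()
    try:
        driver.get(BASE_URL + "/login")                        # assumed route
        driver.find_element(By.NAME, "email").send_keys(email)
        driver.find_element(By.NAME, "password").send_keys(password)
        driver.find_element(By.ID, "login-button").click()     # assumed element id
        message = driver.find_element(By.CLASS_NAME, "flash-message").text
        assert expected_message in message, f"expected '{expected_message}', got '{message}'"
    finally:
        driver.quit()

if __name__ == "__main__":
    run_case("user@example.com", "correct-password", "Welcome")    # valid data
    run_case("not-an-email", "x", "Incorrect email format")        # invalid data
```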

User Interface Testing

User Interface testing verifies a user’s interaction with the software. The goal of UI Testing is to

ensure that the User Interface provides the user with the appropriate access and navigation through

the functions of the target-of-test. In addition, UI Testing ensures that the objects within the UI

function as expected and conform to corporate or industry standards.

Test Objective: Verify the following:
1. Navigation through the target-of-test properly reflects business functions and requirements, including window to window, field to field, and use of access methods (tab keys, mouse movements, accelerator keys).
2. Window objects and characteristics, such as menus, size, position, state, and focus, conform to standards.

Technique: Create / modify tests for each window to verify proper navigation and object states for each application window and its objects.

Completion Criteria: Each window is successfully verified to remain consistent with the benchmark version or within an acceptable standard.

Special Considerations: Not all properties for custom and third-party objects can be accessed.
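A minimal sketch of such a UI check, again with Selenium: verify the window title, that a key object is present and enabled, and that navigation lands on the expected page. The page title, link text, and route are assumptions for illustration; the real pages may expose different objects.

```python
# Minimal sketch of a UI navigation / object-state check with Selenium.
from selenium import webdriver
from selenium.webdriver.common.by import By

driver = webdriver.Chrome()
try:
    driver.get("http://localhost:8000/")            # assumed home page
    assert "ResearchColab" in driver.title          # window title conforms to standard

    # Object state: the sign-up link should be visible and enabled.
    signup_link = driver.find_element(By.LINK_TEXT, "Sign up")   # assumed link text
    assert signup_link.is_displayed() and signup_link.is_enabled()

    # Navigation: following the link should land on the registration window.
    signup_link.click()
    assert driver.current_url.endswith("/signup")   # assumed route
finally:
    driver.quit()
```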

Data and Dataset Integrity Testing

The databases and the database processes should be tested as a sub-system within the "ResearchColab" application. These sub-systems should be tested without the target-of-test's User Interface (as the interface to the data). Additional research into the DBMS needs to be performed to identify the tools / techniques that may exist to support the testing identified below.

Test Objective: Ensure database access methods and processes function properly and without data corruption.

Technique: Invoke each database access method and process, seeding each with valid and invalid data (or requests for data). Inspect the database to ensure the data has been populated as intended and all database events occurred properly, or review the returned data to ensure that the correct data was retrieved (for the correct reasons).

Completion Criteria: All database access methods and processes function as designed and without any data corruption.

Special Considerations: Testing may require a DBMS development environment or drivers to enter or modify data directly in the databases. Processes should be invoked manually. Small or minimally sized databases (a limited number of records) should be used to increase the visibility of any non-acceptable events.
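As a rough sketch of this kind of check, the snippet below uses SQLite as a stand-in DBMS (the document does not name the actual database), with assumed table and column names, and exercises one valid and one invalid insert against a minimally sized database.

```python
# Minimal sketch of a database integrity check. SQLite stands in for the
# project's DBMS; the table and column names are assumptions for illustration.
import sqlite3

conn = sqlite3.connect(":memory:")   # minimally sized, throwaway database
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, email TEXT NOT NULL UNIQUE)")

# Valid data should be accepted and retrievable for the correct reasons.
conn.execute("INSERT INTO users (email) VALUES (?)", ("rakib@example.com",))
row = conn.execute("SELECT email FROM users WHERE email = ?",
                   ("rakib@example.com",)).fetchone()
assert row == ("rakib@example.com",)

# Invalid data (a duplicate key) should be rejected without corrupting the table.
try:
    conn.execute("INSERT INTO users (email) VALUES (?)", ("rakib@example.com",))
except sqlite3.IntegrityError:
    pass
assert conn.execute("SELECT COUNT(*) FROM users").fetchone()[0] == 1
conn.close()
```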

Security and Access Control Testing

Security and access control testing focuses on two key areas:

1. Application-level security, including access to the data and business functions.
2. System-level security, including logging into / remote access to the system.

Application-level security ensures that, based upon the desired security, actors are restricted to specific functions / use cases or are limited in the data that is available to them. For example, everyone may be permitted to enter data and create new accounts, but only managers can delete them. If there is security at the data level, testing ensures that one user "type" can see all customer information, including financial data, while another user "type" only sees the demographic data for the same client.

System-level security ensures that only those actors granted access to the system are capable of accessing the applications, and only through the appropriate gateways.

Test Objective:
Application-level security: Verify that an actor can access only those functions / data for which their user type has permissions.
System-level security: Verify that only those actors with access to the system and application(s) are permitted to access them.

Technique:
Application-level: Identify and list each actor type and the functions / data each type has permissions for. Create tests for each actor type and verify each permission by creating transactions specific to each actor. Then modify the user type and re-run the tests for the same users; in each case, verify that the additional functions / data are correctly available or denied.
System-level: See the special considerations below.

Completion Criteria: For each known actor type, the appropriate functions / data are available, and all transactions function as expected and as run in prior function tests.

Special Considerations: Access to the system must be reviewed / discussed with the appropriate network or systems administrator. This testing may not be required, as it may be a function of network or systems administration.
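One way to script the application-level part is to replay the same protected requests under different actor tokens and compare the HTTP status codes. Everything concrete below (endpoints, roles, tokens, expected status codes) is a hypothetical illustration, not the real ResearchColab API.

```python
# Minimal sketch of an application-level access check over HTTP:
# one request per (actor type, protected route) pair, expecting allowed or denied.
import urllib.error
import urllib.request

BASE_URL = "http://localhost:8000"   # assumed deployment

# (actor type, auth token, protected route, expected HTTP status) - all hypothetical
CASES = [
    ("manager",    "token-manager", "/admin/delete-account", 200),
    ("researcher", "token-user",    "/admin/delete-account", 403),
    ("researcher", "token-user",    "/review-requests",      200),
]

for actor, token, route, expected in CASES:
    req = urllib.request.Request(BASE_URL + route,
                                 headers={"Authorization": "Bearer " + token})
    try:
        with urllib.request.urlopen(req) as resp:
            status = resp.status
    except urllib.error.HTTPError as err:
        status = err.code
    result = "pass" if status == expected else "FAIL"
    print(f"{result}: {actor} -> {route} returned {status} (expected {expected})")
```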

Configuration Testing

Configuration testing verifies the operation of the target-of-test on different software and hardware

configurations. In most production environments, the particular hardware specifications for the

client workstations, network connections and database servers vary. Client workstations may have

different software loaded (e.g. applications, drivers, etc.) and at any one time many different

combinations may be active and using different resources.

Test Objective: Verify that the target-of-test functions properly on the required hardware / software configurations.

Technique: Use Function Test scripts. Open / close various non-target-of-test software, such as the Microsoft applications Excel and Word, either as part of the test or prior to the start of the test. Execute selected transactions to simulate actors interacting with the target-of-test and the non-target-of-test software. Repeat the above process, minimizing the available conventional memory on the client.

Completion Criteria: For each combination of the target-of-test and non-target-of-test software, all transactions are successfully completed without failure.

Special Considerations:
1. What non-target-of-test software is needed and available / accessible on the desktop?
2. What applications are typically used?
3. What data are the applications handling (e.g. a large spreadsheet opened in Excel, a 100-page document in Word)?
4. The entire system (NetWare, network servers, databases, etc.) should also be documented as part of this test.
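For a web application, one practical slice of configuration testing is running the same smoke transaction across browser configurations (the required tools list below already names Chrome, Firefox, and other browsers). The sketch assumes locally installed Chrome and Firefox drivers and an assumed deployment URL.

```python
# Minimal sketch of a configuration test: reuse one smoke transaction
# across more than one browser configuration.
from selenium import webdriver

def smoke_transaction(driver):
    """One representative transaction reused across configurations."""
    driver.get("http://localhost:8000/")   # assumed deployment
    assert "ResearchColab" in driver.title # assumed page title

for name, factory in [("Chrome", webdriver.Chrome), ("Firefox", webdriver.Firefox)]:
    driver = factory()
    try:
        smoke_transaction(driver)
        print(f"{name}: passed")
    finally:
        driver.quit()
```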

Load Testing

Load testing is a performance test which subjects the target-of-test to varying workloads to

measure and evaluate the performance behaviors and ability of the target-of-test to continue to

function properly under these different workloads. The goal of load testing is to determine and

ensure that the system functions properly beyond the expected maximum workload. Additionally,

load testing evaluates the performance characteristics (response times, transaction rates, and other time-sensitive issues).

Test Objective: Verify performance behavior (e.g. response time) for designated transactions or business cases under varying workload conditions.

Technique: Use tests developed for Function or Business Cycle Testing. Modify data files (to increase the number of transactions) or the tests to increase the number of times each transaction occurs.

Completion Criteria: Multiple transactions / multiple users: successful completion of the tests without any failures and within the acceptable time allocation.

Special Considerations: Load testing should be performed on a dedicated machine or at a dedicated time; this permits full control and accurate measurement. The databases used for load testing should be either actual size or scaled equally.
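A minimal load-test sketch in this spirit: the same transaction is replayed from an increasing number of concurrent workers and the response times are recorded. The URL and worker counts are assumptions; a dedicated machine or time slot is assumed, as stated above.

```python
# Minimal sketch of a load test: replay one transaction from N concurrent
# workers and report response times. The URL is an assumed read-only endpoint.
import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor

URL = "http://localhost:8000/review-requests"   # assumed transaction under test

def one_request(_):
    start = time.perf_counter()
    with urllib.request.urlopen(URL) as resp:
        resp.read()
    return time.perf_counter() - start

for workers in (1, 10, 50):                      # increasing workload
    with ThreadPoolExecutor(max_workers=workers) as pool:
        times = list(pool.map(one_request, range(workers * 5)))
    print(f"{workers:3d} workers: avg {sum(times)/len(times):.3f}s, max {max(times):.3f}s")
```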

8

Required tools for testing

1. Visual Studio 15

2. Bugzilla

3. Firebug

4. Mantis

5. WebStorm

6. Selenium

7. Git

8. Tilt

9. Chrome Developer Tools

10. Browsers (IE 8+, Chrome, Firefox, Opera, Safari)

Measures and Metrics

The following information will be collected by the Development team during the Unit testing

process. This information will be provided to the test team at program turnover as well as be

provided to the project team on a biweekly basis.

1. Defects by module and severity.

2. Defect Origin (Requirement, Design, Code)

3. Time spent on defects that are most critical.

The following information will be collected during the test process by the test team. This information will be provided to the test manager and project manager on a weekly basis. The three items above will also be reported, and in addition:

1. Number of times a program submitted to test team as ready for test.

2. Defects located at higher levels that should have been caught at lower levels of testing
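As an illustration of how these counts might be tallied, the sketch below aggregates a hypothetical CSV export from the defect tracker (Bugzilla and Mantis are both listed as tools); the file name and column names are assumptions, not an actual export format.

```python
# Minimal sketch: tally defects by module/severity and by origin from an
# assumed CSV export of the defect tracker (columns are hypothetical).
import csv
from collections import Counter

by_module_severity = Counter()
by_origin = Counter()

with open("defects.csv", newline="") as f:          # assumed export file
    for row in csv.DictReader(f):
        by_module_severity[(row["module"], row["severity"])] += 1
        by_origin[row["origin"]] += 1                # Requirement / Design / Code

print("Defects by module and severity:")
for (module, severity), count in sorted(by_module_severity.items()):
    print(f"  {module} / {severity}: {count}")

print("Defects by origin:", dict(by_origin))
```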

Item Pass/Fail Criteria

Data and Database Integrity Testing Criteria: All database access methods and processes

function as designed and without any data corruption.

Function Testing Criteria: All planned tests have been executed. All identified defects have been

addressed.

User Interface Testing Criteria: Each window is successfully verified to remain consistent with the benchmark version or within an acceptable standard.

Load Testing Criteria: Multiple transactions / multiple users: successful completion of the tests without any failures and within the acceptable time allocation.

Security and Access Control Testing Criteria: For each known user type, the appropriate functions / data are available, and all transactions function as expected and as run in prior application function tests.

Configuration Testing Criteria: For each combination of the target-of-test and non-target-of-test software, transactions are successfully completed without failure.

Test Deliverables

1. Acceptance test plan

2. System/Integration test plan

3. Unit test plans/turnover documentation

4. Screen prototypes

5. Defect/Incident reports and summaries

6. Test logs and turnover reports

Environmental Needs

A few environmental needs must be addressed before testing begins:

1. The development environment needed for testing

2. Extended Lab facility

3. Access to production process

Responsibilities

The following testing tasks are shared among the Test Manager, Project Manager, Development Team, Test Team, and the Client:

1. Acceptance test documentation and execution
2. System/Integration test documentation and execution
3. Unit test documentation and execution
4. System design reviews
5. Detail design reviews
6. Test procedures and rules
7. Screen & report prototype reviews
8. Change control and regression testing

Schedule

The project is scheduled to be submitted by 26 November 2016; therefore, testing must be completed at least one week before the submission deadline.

Planning Risks and Contingencies

Time limitation

More automated testing must be implemented and more members must be added to the testing team in order to cope with the time constraint.

Test Cases

Test Suite ID: TS-1
Test Case ID: TC-1
Test Case Summary: Ensure database access methods and processes function properly and without data corruption.
Related Requirements: N/A
Prerequisites:
1. User is authorized.
2. Minimally sized database.
Test Description:
1. Invoke each database access method and process.
2. Seed each with valid and invalid data (or requests for data).
3. Inspect the database to ensure the data has been populated as intended and all database events occurred properly, or review the returned data to ensure that the correct data was retrieved (for the correct reasons).
Test Log (Pass/Fail): Passed
Created by: Md Rakib Hossain
Date of Creation: 12 November 2016
Executed by: Md Rakib Hossain
Date of Execution: 15 November 2016
Test Environment: OS: Windows 10; Browser: Google Chrome

Test Suite ID: TS-2
Test Case ID: TC-2
Test Case Summary: Validate system response time for designated transactions or functions.
Related Requirements: N/A
Prerequisites:
1. User is authorized.
2. There is a "background" load on the server.
Test Description:
1. Use test scripts developed for Business Model Testing (System Testing).
2. Modify data files (to increase the number of transactions) or modify scripts to increase the number of iterations each transaction occurs.
3. Scripts should be run on one machine (best case to benchmark single user, single transaction) and be repeated with multiple clients (virtual or actual; see the special considerations noted under Load Testing).
Test Log (Pass/Fail): Passed
Created by: Md Rakib Hossain
Date of Creation: 12 November 2016
Executed by: Md Rakib Hossain
Date of Execution: 15 November 2016
Test Environment: OS: Windows 10; Browser: Google Chrome

Test Suite ID: TS-3
Test Case ID: TC-3
Test Case Summary: Verify system response time for designated transactions under varying workload conditions.
Related Requirements: N/A
Prerequisites:
1. User is authorized.
2. Load testing is performed on a dedicated machine or at a dedicated time.
Test Description:
1. Modify data files (to increase the number of transactions) or the tests to increase the number of times each transaction occurs.
Test Log (Pass/Fail): Passed
Created by: Md Rakib Hossain
Date of Creation: 12 November 2016
Executed by: Md Rakib Hossain
Date of Execution: 15 November 2016
Test Environment: OS: Windows 10; Browser: Google Chrome

Test Suite ID: TS-4
Test Case ID: TC-4
Test Case Summary: Verify that a user can access only those functions / data for which their user type has permissions.
Related Requirements: N/A
Prerequisites:
1. User is authorized.
2. Access to the system has been reviewed / discussed with the appropriate network or systems administrator.
Test Description:
1. Identify and list each user type and the functions / data each type has permissions for.
2. Create tests for each user type and verify permissions by creating transactions specific to each user type.
3. Modify the user type and re-run the tests for the same users. In each case, verify that the additional functions / data are correctly available or denied.
Test Log (Pass/Fail): Passed
Created by: Md Rakib Hossain
Date of Creation: 12 November 2016
Executed by: Md Rakib Hossain
Date of Execution: 15 November 2016
Test Environment: OS: Windows 10; Browser: Google Chrome

Test Suite ID: TS-5
Test Case ID: TC-5
Test Case Summary: Validate and verify that the client applications function properly on the prescribed client workstations.
Related Requirements: N/A
Prerequisites:
1. User is authorized.
2. It is known which PC applications are available and accessible on the clients.
3. It is known which applications are typically used.
4. It is known what data the applications are handling (e.g. a large spreadsheet opened in Excel, a 100-page document in Word).
5. The entire system (network servers, databases, etc.) has also been documented as part of this test.
Test Description:
1. Use Integration and System Test scripts; open / close various PC applications, either as part of the test or prior to the start of the test.
2. Execute selected transactions to simulate user activities into and out of various PC applications.
3. Repeat the above process, minimizing the available conventional memory on the client.
Test Log (Pass/Fail): Passed
Created by: Md Rakib Hossain
Date of Creation: 12 November 2016
Executed by: Md Rakib Hossain
Date of Execution: 15 November 2016
Test Environment: OS: Windows 10; Browser: Google Chrome

Features to be tested

As we developed the application throughout the course period, we were able to test the implemented features. The implemented features that have been tested are listed below.

1. User Registration
2. Log in
3. Log out
4. Review Request Posting
5. Bidding on Review Requests
6. Sharing Data
7. Requesting Shared Data
8. Search


Test Cases for features testing

Test Suite ID: TS-6
Test Case ID: TC-6
Test Case Name: User Registration
Test Case Summary: Guest users are registered to the system.
Related Requirements: N/A
Precondition: Must be a non-registered (guest) user.
Prerequisites: Navigate to the registration page.
Test Description:
1. Visit the sign up page.
2. Enter email address and password.
3. Click the 'Sign up' button.
Created by: Md Rakib Hossain
Date of Creation: 12 November 2016
Executed by: Md Rakib Hossain
Date of Execution: 15 November 2016
Test Environment: OS: Windows 10; Browser: Google Chrome

Step 1: Click the registration link. Expected: navigate to the registration page. Result: passed.
Step 2: Give required data (name, incorrect email, password, phone number) and click submit. Expected: "Incorrect email format". Result: passed.
Step 3: Give required data (name, blank email, password, phone number) and click submit. Expected: "Email is required". Result: passed.
Step 4: Give required data (name, email, short password, phone number) and click submit. Expected: "Password is too short". Result: passed.
Step 5: Give required data (name, email, blank password, phone number) and click submit. Expected: "Password is required". Result: passed.
Step 6: Give required data (name, email, password, invalid phone number) and click submit. Expected: "Invalid phone number". Result: passed.
Step 7: Give required data (name, email, password, phone number) and click submit. Expected: "Successfully registered". Result: passed.
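The step table above is naturally data-driven, so it can be automated as one parameterized Selenium script. The sketch below mirrors steps 2 to 7; the route, field names, element ids, and message texts are assumptions for illustration, not the actual ResearchColab pages.

```python
# Minimal sketch of driving the TC-6 steps as data-driven Selenium checks.
from selenium import webdriver
from selenium.webdriver.common.by import By

# (name, email, password, phone, expected message) mirroring steps 2-7 above
CASES = [
    ("Rakib", "bad-email",         "secret123", "01700000000", "Incorrect email format"),
    ("Rakib", "",                  "secret123", "01700000000", "Email is required"),
    ("Rakib", "rakib@example.com", "ab",        "01700000000", "Password is too short"),
    ("Rakib", "rakib@example.com", "",          "01700000000", "Password is required"),
    ("Rakib", "rakib@example.com", "secret123", "abc",         "Invalid phone number"),
    ("Rakib", "rakib@example.com", "secret123", "01700000000", "Successfully registered"),
]

driver = webdriver.Chrome()
try:
    for name, email, password, phone, expected in CASES:
        driver.get("http://localhost:8000/signup")             # assumed route
        driver.find_element(By.NAME, "name").send_keys(name)
        driver.find_element(By.NAME, "email").send_keys(email)
        driver.find_element(By.NAME, "password").send_keys(password)
        driver.find_element(By.NAME, "phone").send_keys(phone)
        driver.find_element(By.ID, "signup-button").click()    # assumed element id
        body = driver.find_element(By.TAG_NAME, "body").text
        print("pass" if expected in body else "FAIL", "-", expected)
finally:
    driver.quit()
```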

Test Suite ID: TS-7
Test Case ID: TC-7
Test Case Name: Log in
Test Case Summary: Log into the system as a registered user.
Related Requirements: N/A
Precondition: Have a registered account.
Prerequisites: Navigate to the log in page.
Test Description:
1. Visit the log in page.
2. Enter valid email address and password.
3. Click the 'Log in' button.
Created by: Md Rakib Hossain
Date of Creation: 12 November 2016
Executed by: Md Rakib Hossain
Date of Execution: 15 November 2016
Test Environment: OS: Windows 10; Browser: Google Chrome

Step 1: Click the log in link. Expected: navigate to the log in page. Result: passed.
Step 2: Give required data (incorrect email format, password) and click Log in. Expected: "Incorrect email format". Result: passed.
Step 3: Give required data (blank email, password) and click Log in. Expected: "Email is required". Result: passed.
Step 4: Give required data (email, short password) and click Log in. Expected: "Password is too short". Result: passed.
Step 5: Give required data (email, blank password) and click Log in. Expected: "Password is required". Result: passed.

Test Suite ID: TS-8
Test Case ID: TC-8
Test Case Name: Log out
Test Case Summary: Log out from the system.
Related Requirements: N/A
Precondition: Must be logged in.
Prerequisites: Navigate to the log out link.
Test Description:
1. Click the 'Log out' button.
2. Get out of the system.
Created by: Md Rakib Hossain
Date of Creation: 12 November 2016
Executed by: Md Rakib Hossain
Date of Execution: 15 November 2016
Test Environment: OS: Windows 10; Browser: Google Chrome

Step 1: Click log out. Expected: prompt the user for logging out. Result: passed.
Step 2: Click confirmation. Expected: logged out successfully. Result: passed.

Test Suite ID: TS-9
Test Case ID: TC-9
Test Case Name: Review Request Posting
Test Case Summary: An authenticated user can post a review request.
Related Requirements: N/A
Precondition: Must be logged in.
Prerequisites: Navigate to the Post Review Request option.
Test Description:
1. Select 'Post Review Request'.
2. Enter title.
3. Give description.
4. Add expertise tag.
5. Click on the 'Post' button.
Created by: Md Rakib Hossain
Date of Creation: 12 November 2016
Executed by: Md Rakib Hossain
Date of Execution: 15 November 2016
Test Environment: OS: Windows 10; Browser: Google Chrome

Step 1: Click the Post Review Request link. Expected: review request form. Result: passed.
Step 2: Give required data (title, description, tag) and click the Post button. Expected: "Request posted successfully". Result: passed.
Step 3: Give required data (blank title, description, tag) and click the Post button. Expected: "Title is required". Result: passed.
Step 4: Give required data (title, blank description, tag) and click the Post button. Expected: "Description is required". Result: passed.
Step 5: Give required data (title, description, blank tag) and click the Post button. Expected: "Tag is required". Result: passed.
Step 6: Give required data (long title, description, tag) and click the Post button. Expected: "Title is too long". Result: passed.
Step 7: Give required data (title, description, long tag) and click the Post button. Expected: "Tag is too long". Result: passed.

Test Suite ID: TS-10
Test Case ID: TC-10
Test Case Name: Bidding on a Review Request
Test Case Summary: An authenticated user can bid on a review post.
Related Requirements: N/A
Precondition: Must be logged in.
Prerequisites: Navigate to the review request posts.
Test Procedures:
1. View a review request.
2. Click on the "Bid" button.
Created by: Md Rakib Hossain
Date of Creation: 12 November 2016
Executed by: Md Rakib Hossain
Date of Execution: 15 November 2016
Test Environment: OS: Windows 10; Browser: Google Chrome

Step 1: Click on a review request post. Expected: details of the review request are shown. Result: passed.
Step 2: Click the Bid button. Expected: the bid on the request is placed successfully. Result: passed.

Test Suite ID: TS-11
Test Case ID: TC-11
Test Case Name: Sharing Data
Test Case Summary: An authenticated user can share data.
Related Requirements: N/A
Precondition: Must be logged in.
Prerequisites: Navigate to the Share Data option.
Test Description:
1. Select the 'Share Data' option.
2. Enter title.
3. Give description.
4. Add expertise tag.
5. Add data file.
6. Click on the 'Share' button.
Created by: Md Rakib Hossain
Date of Creation: 12 November 2016
Executed by: Md Rakib Hossain
Date of Execution: 15 November 2016
Test Environment: OS: Windows 10; Browser: Google Chrome

Step 1: Click the Share Data link. Expected: data sharing form. Result: passed.
Step 2: Give required data (title, description, file, tag) and click the Share button. Expected: "Successfully shared data". Result: passed.
Step 3: Give required data (blank title, description, file, tag) and click the Share button. Expected: "Title is required". Result: passed.
Step 4: Give required data (title, blank description, file, tag) and click the Share button. Expected: "Description is required". Result: passed.
Step 5: Give required data (title, description, file, blank tag) and click the Share button. Expected: "Tag is required". Result: passed.
Step 6: Give required data (title, description, invalid file format, tag) and click the Share button. Expected: "File format is invalid". Result: passed.
Step 7: Give required data (title, description, over-sized file, tag) and click the Share button. Expected: "File size is too large". Result: passed.
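The file-upload step is what distinguishes this flow from the other form tests; with Selenium it is typically automated by sending a local file path to the file input. The sketch below assumes the route, field names, element ids, and a local sample file.

```python
# Minimal sketch of the data-sharing flow, including the file upload.
import os
from selenium import webdriver
from selenium.webdriver.common.by import By

driver = webdriver.Chrome()
try:
    driver.get("http://localhost:8000/share-data")              # assumed route
    driver.find_element(By.NAME, "title").send_keys("Sample dataset")
    driver.find_element(By.NAME, "description").send_keys("Survey responses, anonymized")
    driver.find_element(By.NAME, "tag").send_keys("statistics")
    file_input = driver.find_element(By.CSS_SELECTOR, "input[type='file']")
    file_input.send_keys(os.path.abspath("dataset.csv"))        # assumed local test file
    driver.find_element(By.ID, "share-button").click()          # assumed element id
    assert "Successfully shared data" in driver.find_element(By.TAG_NAME, "body").text
finally:
    driver.quit()
```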

Test Suite ID: TS-12
Test Case ID: TC-12
Test Case Name: Requesting Shared Data
Test Case Summary: An authenticated user can request shared data.
Related Requirements: N/A
Precondition: Must be logged in.
Prerequisites: Navigate to the shared data posts.
Test Procedures:
1. View a shared data item.
2. Click on the "Request Data" button.
Created by: Md Rakib Hossain
Date of Creation: 12 November 2016
Executed by: Md Rakib Hossain
Date of Execution: 15 November 2016
Test Environment: OS: Windows 10; Browser: Google Chrome

Step 1: Click on a shared data item. Expected: details of the data are shown. Result: passed.
Step 2: Click the "Request Data" button. Expected: the request is made successfully. Result: passed.

Test Suite ID: TS-13
Test Case ID: TC-13
Test Case Name: Search within the system (users, review posts, shared data, discussion board)
Test Case Summary: An authenticated user can search within the system.
Related Requirements: N/A
Precondition: Must be logged in.
Prerequisites: Navigate to the search box.
Test Procedures:
1. Enter a search keyword.
2. Click on the "Search" button.
Created by: Md Rakib Hossain
Date of Creation: 12 November 2016
Executed by: Md Rakib Hossain
Date of Execution: 15 November 2016
Test Environment: OS: Windows 10; Browser: Google Chrome

Step 1: Give a search keyword that matches existing items and click the "Search" button. Expected: the list of matching items is shown. Result: passed.
Step 2: Give a search keyword with no matches and click the "Search" button. Expected: "Nothing is found with this keyword". Result: passed.
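A short sketch of automating both search outcomes (results found and nothing found). The route, search field name, button id, and the "Nothing is found" message are assumptions for illustration.

```python
# Minimal sketch of the TC-13 search checks with Selenium.
from selenium import webdriver
from selenium.webdriver.common.by import By

def search(driver, keyword):
    driver.get("http://localhost:8000/search")                  # assumed route
    driver.find_element(By.NAME, "q").send_keys(keyword)        # assumed field name
    driver.find_element(By.ID, "search-button").click()         # assumed element id
    return driver.find_element(By.TAG_NAME, "body").text

driver = webdriver.Chrome()
try:
    assert "Nothing is found" not in search(driver, "machine learning")  # expect results
    assert "Nothing is found" in search(driver, "zzzz-no-such-keyword")  # expect empty
finally:
    driver.quit()
```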