Master Test Plan - Template



Master Test Plan




Document Control

Document Detail

Title:                     MASTER TEST PLAN
Electronic File Name:      Master Test Plan
Electronic File Location:  https://github.com/rgtacho/JSF/tree/master/Practica15/Test Strategy
Author:                    Anastacio Rodríguez García
Contributions:             Anastacio Rodríguez García, Ricardo Rodríguez García, Marcelo Rodríguez García, Itzayana Coral Rodríguez Colmenero, Magdalena Colmenero González.

Revision Chart

Date        Version  Description of Changes  Author                      Reviewed by
13-12-2015  1.0      Document creation.      Anastacio Rodríguez García  Itzayana Coral Rodríguez Colmenero

Referenced Documentation

Ref  Document Name   Electronic File Location
1    Test Matrix     https://github.com/rgtacho/JSF/tree/master/Practica15/Test_Strategy/matriz_pruebas.pdf
2    Test Cases      https://github.com/rgtacho/JSF/tree/master/Practica15/Test_Strategy/casos_prueba.pdf
3    Bug Tracking    https://github.com/rgtacho/JSF/tree/master/Practica15/Test_Strategy/bug_tracking.pdf
6    Final Document  https://github.com/rgtacho/JSF/tree/master/Practica15/Test_Strategy/DoctoFinalTesting.pdf

Table of Contents

1. Introduction
   1.1 Description
   1.2 Purpose
   1.3 Intended Audience
   1.4 Scope
   1.5 Out of Scope
   1.6 Approach
   1.7 Assumptions, Constraints and Dependencies
       1.7.1 Assumptions
       1.7.2 Constraints
       1.7.3 Dependencies
   1.8 Risks
   1.9 References
   1.10 Roles and Responsibilities
2. Test Strategy
   2.1 Testing Types
       2.1.1 Smoke Test
       2.1.2 Functional Testing
       2.1.3 Defect Verification
       2.1.4 Regression Testing
       2.1.5 Performance Testing
       2.1.6 Integration Testing
       2.1.7 Production Validation Testing
       2.1.8 Other Tests
   2.2 Test Cycle Suspension and Resumption Criteria
   2.3 Tools
   2.4 Issue and Defect Verification and Validation Procedures
   2.5 Roles
   2.6 System Environments and Infrastructure
3. Project Milestones
4. Deliverables
   4.1 Key Deliverables
5. Management Process and Procedures
   5.1 Post Mortem and Lessons Learned
   5.2 Critical Success Factors
   5.3 Change Management
6. Appendix A - High Level Test Matrix
7. Appendix B - Glossary of Terms
8. Sign-Off

1. Introduction

1.1 Description

This Master Test Plan defines the approach to testing the SISTEMA DE CONTROL DE USUARIOS 1.0 release. It briefly describes the methods and tools used by the Quality Assurance team to validate the performance of the system.

1.2 Purpose

The purpose of this document is to outline the approach that the Quality Assurance team will take to ensure that the Functional Acceptance criteria are met. Specifically, this document details the:

Functional Acceptance Criteria

Workload Distribution used to exercise the system

Testing Schedule





Test Types

Data and data management issues

1.3 Intended Audience

The intended audience for this document includes:

ICW Steering Committee membership

Project Managers

QA Team Members

Business Analysts

1.4 Scope

Testing efforts for release [x.y] will focus on the following: [Release-specific Scope Items 1 - n]

Defect fixes

Smoke Testing

Integration Testing

Regression Testing

Production validation

1.5 Out of Scope

Non-Functional Testing [this may be added to scope based on project analysis]

Ad-hoc Testing

Exceptions to this rule include:

Late additions to scope via the PCR process

Any unforeseen requirements introduced during the testing cycle that are outside the documented PCR process

Last minute fix to production defects

User Acceptance Testing (UAT)

This will be the responsibility of our business partners

Tests executed by the UAT team will be excluded from the scope of ICW QA testing

[Features specific to a third party application that are non-impacting to the overall ICW testing effort] (applies if working with a third party vendor)

This will be the responsibility of the [third party vendor]

Tests executed by the [third party vendor] will be excluded from the scope of ICW QA testing

1.6 Approach

Testing tasks will be conducted in line with the Software Test Life Cycle (STLC) and in support of the Software Development Life Cycle (SDLC). The documents used within the SDLC will be completed both by the QA Team and by the project participants responsible for providing information and deliverables to QA.

Testing will be conducted in line with the Agile strategy, as shown in the high-level diagram.

Important testing approach considerations:

The QA Engineer is responsible for validating expected results at the points defined in the testing process diagrams

When an error arises, the QA Engineer will log a defect in Service-Now and follow up with the developers, who will communicate when the fix has been implemented so that another test cycle can be conducted.

The QA Engineer will be responsible for tracking and following up on defects associated with their systems (in accordance with the Defect Tracking Section).

Developers will provide an ETA for fixing the errors reported

End-to-end test cases will cover the entire functionality; only when a test fails will a detailed review of the pre-defined checkpoints take place.

1.7 Assumptions, Constraints and Dependencies

1.7.1 Assumptions

Test cases will be designed to validate:

Business requirements designated for this release

In-scope production defects

[Third party vendor] features that directly impact the application under test

Additions to scope made through the PCR process are subject to ad-hoc testing

QA will not accept changes after delivery of the final release candidate

QA will consider testing any change after delivery using an ad-hoc approach

QA testing will occur in parallel with the UAT testing effort at a point during the test cycle

QA will focus on release-specific improvements and new features first, ensuring that defects encountered in these areas are uncovered earlier in the testing cycle

Each area of testing will be certified by its respective testing organization. [Required when working with third party vendors] It is further assumed that the ICW QA organization will not execute testing in areas certified by the [third party vendor] QA organization.

1.7.2 Constraints

The project must operate within the following limits:

Time: testing tasks will be constrained by time due to QA workload

Required system resources: if the application is not accessible, it will impact the testing effort

1.7.3 Dependencies

The time at which QA can test specific functionality is dependent on accurate delivery by development

QA Engineer or other project team members may be affected by multiple project activities

Requirements documentation has been provided

Required security access for QA Engineer

Test cases have been developed and signed off

1.8 Risks

List any risk factors to the testing effort and mitigation plans should the risk come to realization.


Environmental issues encountered during testing may potentially impact the testing schedule. [Example] QA will collaborate with I&O during testing to ensure the stability of the environment during test phases. Should issues arise, testing will be done in the Staging environment. [Example]

The discovery of a critical defect late in the testing cycle may potentially impact the production delivery schedule. [Fill-in]

Any addition to the defined scope of testing will potentially impact the successful completion of the testing effort within the time allotted. [Fill-in]

Late changes to any documentation may result in incomplete testing of in-scope features. [Fill-in]

Functional defects encountered in areas certified by [third party vendor] may be introduced into the production environment, as these areas are not being evaluated by the ICW QA team. [Fill-in]

Intermittent issues detected late in the test cycle, or not at all, may be introduced into the production environment. [Fill-in]

Resource allocations have a potential impact on the project. Resources from all levels of the project (PM, BA, QA, Dev) can be assigned to multiple projects, which can adversely affect timelines. [Fill-in]

1.9 References

The following key reference material will be used to create the necessary test cases: [List all documents appropriate to the application under test]

1.10 Roles and Responsibilities

List the Roles and Responsibilities in terms of QA activities, defining the activities of each resource.

Role                Resource

Project Manager     [Fill-in]
Business Analyst    [Fill-in]
QA Supervisor       [Fill-in]
Dev. Lead           [Fill-in]
Solution Architect  [Fill-in]
QA Lead             [Fill-in]

Activity                                                  Responsible

Defining, Planning and Team Organization                  PM
Provide Business Requirements, Data Mapping, Use Cases
  and any additional information needed to develop
  the QA processes                                        BA
Development Team Activities Assignation                   Dev Team
Unit Test                                                 Dev Team
Release Delivery                                          Dev Team
Test Planning and Estimation                              QA Team
Review and Sign off Test Plan                             PM, BA, Dev Team, QA Team
Testing Documentation                                     QA Team
Provision of Test Environment Set-up                      Dev Team, QA Team
Provision of Unit Tested Test Items                       Dev Team
Test Preparation and Execution                            QA Team
Ongoing Test Reporting                                    QA Team
Tracking Defects                                          QA Team
Test Summary Reporting                                    QA Team
Bug fixes and return to QA for re-test                    Dev Team
Re-Testing Execution                                      QA Team
QA Completion of Activities                               QA Team
QA Sign Off                                               QA Team
Deploys in Production                                     Dev Team

2. Test Strategy

2.1 Testing Types

The main considerations for this document going forward are the techniques to be used and the criteria for knowing when testing is completed.

2.1.1 Smoke Test

Smoke testing is used in both the lower tier and the production environments to validate the integrity of a build being introduced into the environment. This testing occurs prior to the beginning of a formal test cycle. Based on the execution of a very limited set of test cases, a determination will be made as to whether testing in the lower tier environment can proceed. If the results are not successful, testing will not begin until any identified blocking issues are remediated.

Test Objective: Ensure that the most basic functions of the delivered application are stable enough to execute a full test cycle.

Technique: Execute each test case that includes the primary path to verify the following:

The expected results occur when valid data are used.

Completion Criteria: Testing is complete when:

All planned tests have been executed.

All identified defects have been addressed to the extent that the test cycle can proceed.

Special Considerations: A 100% test case pass rate is required to consider the application stable enough to proceed.

Intermittent issues may not be addressed until after they are reproduced in the lower tier environment once the formal test cycle has begun.

Should smoke testing result in a subsequent deployment to remediate an identified issue, test re-execution will occur once that deployment has completed.

Smoke testing verifies the basic functionality of the application under test and does not exercise all areas of the system.
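As an illustration, a smoke check of this kind can be sketched as a handful of primary-path assertions against the most basic user operations. The `UserStore` class and its methods below are hypothetical stand-ins, not part of the actual SISTEMA DE CONTROL DE USUARIOS code base.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical stand-in for the application's user repository;
// the real system's classes and method names may differ.
class UserStore {
    private final Map<String, String> users = new HashMap<>();

    boolean create(String id, String name) {
        if (id == null || id.isEmpty() || users.containsKey(id)) {
            return false;             // reject empty or duplicate ids
        }
        users.put(id, name);
        return true;
    }

    String find(String id) {
        return users.get(id);         // null when the user does not exist
    }
}

public class SmokeTest {
    public static void main(String[] args) {
        UserStore store = new UserStore();
        // Smoke test: only the most basic primary-path functions,
        // with valid data; a single failure blocks the full cycle.
        if (!store.create("u1", "Ana")) throw new AssertionError("create failed");
        if (!"Ana".equals(store.find("u1"))) throw new AssertionError("find failed");
        System.out.println("SMOKE PASS");
    }
}
```

Because the smoke suite gates the whole cycle, it deliberately stays small and uses only valid data; negative flows are left to the functional tests.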

2.1.2 Functional Testing

Functional testing for the [x.y] release will consist of executing the newly created or updated test cases needed to support the [highlight changes] being delivered in this release. In addition, tests on a number of defects, primarily from production, may also be performed. Please note that these figures exclude scope additions that are the result of the PCR process.

Testing will focus on the requirements that can be traced directly to relevant use cases, defect reports, or business functions and business rules. The goals of these tests are to verify proper data acceptance, processing, and retrieval, and the appropriate implementation of business rules. As mentioned earlier, additions to scope made through the PCR process are subject to ad-hoc testing.

All in-scope functional testing is based on black box techniques. These techniques verify the application and its internal processes by interacting with the application via the Graphical User Interface (GUI) and analyzing the output or results. Identified below is an outline of the testing recommended for each application:

Test Objective: Ensure proper functionality, including navigation, data entry, processing, and retrieval.

Technique: Execute each test case, which includes the primary path as well as alternate and negative flows, to verify the following:

The expected results occur when valid data is used.

The appropriate error or warning messages are displayed when invalid data or business flows are used.

Each business rule is properly applied.

Completion Criteria: Testing is complete when:

All planned tests have been executed.

All identified defects have been addressed, or validated and accepted by our business partners as known issues to be promoted to production.

Special Considerations: A minimum 85% test case pass rate is required to consider the application stable.

No major defects remain open.

Intermittent issues may not be addressed until after they are reproduced in the production environment.

Defect severity may be modified by our business partners

PCR-related additions to scope are subject to testing using ad-hoc techniques
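A functional test of this shape can be sketched as a black-box check of one business rule, exercising both the primary path (valid data, no errors) and a negative flow (invalid data, expected error message). The `UserValidator` class and the 3-12 alphanumeric-character rule are invented for illustration; the real system's rules and error texts would come from its requirements.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical validator for an illustrative business rule:
// usernames must be 3-12 alphanumeric characters.
class UserValidator {
    List<String> validate(String username) {
        List<String> errors = new ArrayList<>();
        if (username == null || username.isEmpty()) {
            errors.add("username is required");
        } else if (!username.matches("[A-Za-z0-9]{3,12}")) {
            errors.add("username must be 3-12 alphanumeric characters");
        }
        return errors;                 // empty list means the data is valid
    }
}

public class FunctionalTest {
    public static void main(String[] args) {
        UserValidator v = new UserValidator();
        // Primary path: valid data produces no errors.
        if (!v.validate("ana2015").isEmpty()) throw new AssertionError();
        // Negative flow: invalid data produces the expected error message.
        if (!v.validate("").contains("username is required")) throw new AssertionError();
        if (!v.validate("a!").contains("username must be 3-12 alphanumeric characters"))
            throw new AssertionError();
        System.out.println("FUNCTIONAL PASS");
    }
}
```

Note the black-box character of the checks: only inputs and observable outputs (error messages) are inspected, never the validator's internals.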

2.1.3 Defect Verification

Defect verification will be performed prior to the functional and regression tests. This verification will ensure that all defects that are in scope for the release have been fixed and addressed in the current release.

Test Objective: Ensure production defects targeted to be fixed in the current release have been addressed.

Technique: Execute test cases that specifically target the defects under test.

Completion Criteria: Testing is complete when:

All identified defects have been tested and validated.

Special Considerations: Intermittent defects may or may not be reproducible in lower tier environments.

Some production defects may require specific production data for validation.

Failure of a validation may or may not hinder the deployment to production.
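A defect verification test of this kind replays the exact steps from the defect report against the fixed build. The scenario below (duplicate user ids were accepted before the fix) and the `UserRegistry` class are invented for illustration only; in practice the steps come from the tracked defect record.

```java
import java.util.HashSet;
import java.util.Set;

// Hypothetical fixed component: before the (invented) defect fix,
// duplicate ids were accepted; after the fix they must be rejected.
class UserRegistry {
    private final Set<String> ids = new HashSet<>();

    boolean register(String id) {
        return ids.add(id);           // false when the id already exists
    }
}

public class DefectVerification {
    public static void main(String[] args) {
        UserRegistry registry = new UserRegistry();
        // Replay the reproduction steps from the defect report.
        if (!registry.register("user-1")) throw new AssertionError("first register failed");
        // The defect is verified fixed only if the duplicate is now rejected.
        if (registry.register("user-1")) throw new AssertionError("defect regressed: duplicate accepted");
        System.out.println("DEFECT VERIFIED FIXED");
    }
}
```

Because the test encodes the defect's reproduction steps, it can later be folded into the regression suite to guard against the defect reappearing.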

2.1.4 Regression Testing

A regression test validates that application functionality did not degrade after the introduction of updated code into the production environment. To this end, a number of tests, created and executed over the course of several releases but not specific to the functionality being delivered, will be executed.

Test Objective: Ensure continued proper operation of existing functionality, according to the required business processes, for in-scope items.

Technique: Testing will simulate several business processes by performing the following:

Execution of all critical test scenarios as defined by the Quality Assurance team. This comprises approximately 30% of the regression test s...