8/8/2019 QA Standards 2008
Quality Assurance Process, Revision 2008
CONFIDENTIALITY NOTE: THIS DOCUMENT, INCLUDING ANY AND ALL ATTACHMENTS, CONTAINS CONFIDENTIAL INFORMATION INTENDED ONLY FOR THE USE OF TG. IF THE RECIPIENT OF THIS MESSAGE IS NOT AN AUTHORIZED PARTY OR AN EMPLOYEE OF AN AUTHORIZED PARTY, YOU ARE HEREBY NOTIFIED THAT READING, DISCLOSING OR EXPLOITING THIS DOCUMENT IS STRICTLY PROHIBITED. IF YOU HAVE RECEIVED THIS DOCUMENT IN ERROR, PLEASE IMMEDIATELY RETURN IT TO THE SENDER AND DELETE IT FROM ANY DOCUMENT STORAGE SYSTEM.
TG Company Confidential. A printed copy of this document is considered uncontrolled. Refer to the online version for the controlled revision.
Project/Documentation Information
Contact Information
QA Lead: Trinity Sheil Phone: X4931
Document History
Revision Name Date Comments
1.0 Trinity Sheil 10/10/2007 Initial Draft
1.1 Trinity Sheil 02/26/2008 Update custom information
1.2 Trinity Sheil 03/03/2008 Change Title
Table of Contents
1. Introduction
2. Reference Documents
3. Scope
4. Activities & Deliverables
4.1. Requirements Analysis & Traceability
4.2. Risk Identification & Mitigation
4.3. Project Milestones
4.4. Records Collections and Retention
4.5. Test Case Design
4.6. Test Case Review
5. Testing
5.1. External Testing (UAT)
5.2. Test Environments
5.3. Test Types
5.4. Test Results
5.5. Test Tools
5.6. Mercury Quality Center
5.7. Defect Reporting & Problem Resolution
5.8. Severity Definitions
5.9. Defect Triage
5.10. Metrics
6. Moves to the Q Environments
7. Release Management Process Certification
7.1. Release Management Diagram
7.2. Certification Build Process
7.3. Entrance & Exit Criteria
7.4. Assumption and Risks
Appendix I Glossary
1. Introduction
The goal of the Enterprise Project Management Office (EPMO) Quality Assurance (QA) team is to provide the oversight necessary to ensure that all software development projects for TG are developed and delivered using good software engineering practices.

This includes development for:
(1) Application Support of Platform Convergence, maintenance, and repair,
(2) Production, and
(3) Next Generation projects.

Application Support includes, at a minimum, testing of fixes, modifications, and the addition of minor new functionality to existing TG applications using commercial off-the-shelf (COTS) products and any integration with third-party tools or in-house development efforts.

Production Support includes, at a minimum, adding new functionality to the existing code base, using the existing infrastructure comprised of the architecture of databases, applications, devices, and connectivity.

Next Generation Support includes, at a minimum, two types of support: the testing of new features and functionality, such as network monitoring and database reporting. Some or all of these require new additions to the underlying infrastructure.

The purpose of this document is to identify the standards and processes to be utilized by QA, including the methodology, verification, and validation for application testing.
2. Reference Documents
Document Name                  SharePoint Location
Triage Process                 G:\EPMO_Design & Testing Services\EPMO_QA_Team\STANDARDS\Defect Triage Process.doc
Release Management Process     To Be Added
Defect Management Process
Test Case Writing Document
Issue Management               To Be Added
Change Control Process         To Be Added
3. Scope
This document defines the activities and processes employed by QA in support of development projects and maintenance initiatives, as categorized below:
Verification and validation support of software project initiatives and maintenance releases
QA team management
Support of process initiatives to enhance existing processes as deemed necessary
4. Activities & Deliverables
4.1. Requirements Analysis & Traceability
Typically, traceability is performed on the following documents:
Product Requirements,
Functional Specification,
Test Plans and Test Cases.
However, this list will be specifically defined by the EPMO on a per-project basis.
QA will review each of these documents for completeness, correctness, traceability, consistency, applicability, and testability. QA also verifies there is traceability between the documents. Traceability verifies that product requirements are matched with functional requirements and that these are in turn matched with test cases. In this way QA can ensure complete traceability in both the forward and backward directions.
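The forward and backward checks described above can be sketched as a short script; the document IDs and the mapping structure here are illustrative assumptions, not part of the TG process.

```python
# Hypothetical traceability check: every product requirement must trace
# forward to at least one functional spec, and every functional spec to at
# least one test case; backward, nothing may dangle unclaimed.

def untraced(sources, links):
    """Return source items with no forward link."""
    return [s for s in sources if not links.get(s)]

# requirement -> functional specs, functional spec -> test cases (example IDs)
prd_to_fs = {"PRD-1": ["FS-1", "FS-2"], "PRD-2": ["FS-3"], "PRD-3": []}
fs_to_tc = {"FS-1": ["TC-101"], "FS-2": ["TC-102", "TC-103"], "FS-3": []}

gaps_fwd = untraced(prd_to_fs, prd_to_fs) + untraced(fs_to_tc, fs_to_tc)
print(gaps_fwd)  # ['PRD-3', 'FS-3'] -- items missing forward traceability

# Backward check: every functional spec must be claimed by some requirement
claimed_fs = {fs for links in prd_to_fs.values() for fs in links}
orphans = [fs for fs in fs_to_tc if fs not in claimed_fs]
print(orphans)  # [] -- all functional specs trace back to a requirement
```

In practice QC holds these links, but the same two passes (forward coverage, backward claim) are what the review verifies.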
4.2. Risk Identification & Mitigation
QA will identify and submit project and process risks to the Project Manager (PM) for each project. Although the specific list for each project will be defined by the PM, risks typically include the following:
Description of the risk
Likelihood-of-occurrence factor
Ability-to-detect factor
Impact-of-occurrence factor
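One common way to combine the three factors above into a single ranking is an FMEA-style risk priority number (RPN); the 1-5 scoring scale, the threshold convention, and the example risks are assumptions for illustration, not values mandated by this document.

```python
# FMEA-style risk priority number: likelihood x detectability x impact.
# Scales are an assumption: each factor scored 1 (low) .. 5 (high); for
# detectability, a high score means the risk is HARD to detect, so a higher
# number is consistently worse across all three factors.

def rpn(likelihood, detectability, impact):
    for factor in (likelihood, detectability, impact):
        if not 1 <= factor <= 5:
            raise ValueError("factors must be scored 1-5")
    return likelihood * detectability * impact

risks = [
    ("Test environment delivered late", 4, 2, 5),   # RPN 40
    ("Requirements churn mid-cycle", 3, 3, 4),      # RPN 36
]
ranked = sorted(risks, key=lambda r: rpn(*r[1:]), reverse=True)
print(ranked[0][0])  # Test environment delivered late
```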
4.3. Project Milestones
QA testing milestones and activities are incorporated into the overall Project schedule.
Milestone status is reported as the project progresses. Specific milestones for QA testing include:
Milestone/Activity     Deliverable(s)
Project Start          QA Process Document; Test Strategy
Execution Phase        Requirements Traceability; Review or Create Test Plan; Performance Test Plan (if necessary); Test Cases
Test Review            Internal Test Results Audit; UAT Test Plan/Test Cases (if necessary)
Defect Review          Test Results / Defect Count
Certification Review   Test Certification; Production Readiness
Production Go Live     Dry run in Production
Post Mortem            Participate in Review
4.4. Records Collections and Retention
All applicable documents are stored on the share in the Project folder: G:\Corporate Projects
4.5. Test Case Design
Test Case Design Reviews will be held for all new test case creation. Reviews should include availability of the appropriate test case design based on a Requirements Document, Wire Frame, HTML mock-up, or Functional Document. Discussions are held between QA and the Business Analysts (BAs) to address questions and/or concerns and to identify issues and risks. Successful completion of the design review results in approved test cases.
Test Case Samples: G:\EPMO_Design & Testing Services\EPMO_QA_Team\STANDARDS\Test Case Creation_08.doc
4.6. Test Case Review
Completed test cases are marked ready for review in QC by the tester. The test cases will be reviewed and approved by the QA Lead on that project. Any issues or comments will be documented and tracked via QC. The tester repairs the test cases and resubmits them for review. Upon approval, the test cases are marked ready for execution in QC. Evaluation criteria include completeness, traceability, testability, and technical assessment. Supporting documents that are base-lined are added to QC as project artifacts.
5. Testing
Testing is an integral part of the product development process. A test strategy is created for each project that is tested by QA. This covers testing from the lowest practical level up to the user and customer level. QA will validate that all tests have been executed successfully prior to entering certification testing. QA will be a contributor to and approver of test strategies, test plans, and automated and/or manual test cases for integration, system, performance/stress, and regression testing.

The following criteria will be used by QA to review the test plans and test cases:
High-level requirements traceability
Functional requirements traceability
Test case traceability
Test scope and coverage: appropriate level of testing for features or aspects of the application(s)
Integrity
Correctness of assumptions or pre-conditions
Accuracy of expected results of test cases
This data will be collected by QA.
5.1. External Testing (UAT)
UAT will be conducted to ensure the features to be released meet the user and business needs.
All product releases will require participation and approval from internal business user(s) prior to
deployment.
End users will define and execute user acceptance testing (UAT) with a defined set of business users. Upon completion of UAT, QA will audit the test results to ensure success and readiness for promotion to production.
This external testing will be conducted for all major releases as a final test gate to Production.
5.2. Test Environments
A controlled test environment will be used for all QA validation activities. Only QA and the environments team will have access permissions to test systems and data unless explicitly approved by QA for a defined, limited time period. Any maintenance or changes to the QA test environment shall be coordinated between QA and the environments team. In addition, QA may randomly request login changes if the integrity of the environment is deemed at risk.

Verification and validation of the build process and the environment setup is performed in the test environment. It is in this environment that both internal and external (UAT) testing takes place.

Typically the test environment will mirror the target production configuration, architecture, and performance, and will contain simulated and actual elements including third-party applications, proprietary code, servers, and networking devices. Exceptions to the test environment and the required test data will be determined on a per-project basis and defined in the test strategy document during the Execution phase. The exception to this environment being mirrored to production will be the advancement of a certification environment.

Issues found during QA testing and UAT will be logged and documented by QA, Development, and UAT testers, and stored in Quality Center (QC).
5.3. Test Types
The following types of testing will be performed by QA, depending on the project:
Functional Test
Smoke Test
System Test
Regression Test
Performance, Load, and Stress Test
User Acceptance Test
Automated Test

5.3.1. Automated Test Architecture
The automated tests are created using the automated test architecture guide; this ensures that all tests are created using common design principles, reusable actions, and global repositories.
The guide can be found here: G:\EPMO_Design & Testing Services\EPMO_QA_Team\PROCESS\QC Automation\QC Automation Architecture.doc
5.4. Test Results
The QA team will provide test results for all automated and manual tests conducted during testing. Results contain, at a minimum, the following information:

Automated
Name of test suite
Name of each test included in the test suite
Day, month, year, and time of test run
Results of the test suite, specifying the total number of tests passed, failed, and not run at this time
Length of run, either of an individual test or of the test suite
Build version tested
If a Change Request (CR) is created due to a failed test case, the number of the test case will be included in the CR

Manual
Name of Development tester
Date of test
Description of the functionality being tested; if these do not match the descriptions in this document, a matrix will be provided with the test results
Results of each test
Build version tested
If a CR is created based on a test case, the test case number will be included
The results of running each test, including error message(s) when appropriate
Any issues found in the test results will be documented by QA in the defect tracking tool.
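As a sketch, the automated result fields listed above could be captured in a record like the following; the class and field names are illustrative assumptions, not a schema mandated by QC or this process.

```python
# Minimal sketch of the automated test-suite result record described above.
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class SuiteResult:
    suite_name: str
    run_at: datetime             # day, month, year, and time of the run
    build_version: str
    passed: int
    failed: int
    not_run: int
    duration_s: float            # length of run for the whole suite
    failed_case_crs: dict = field(default_factory=dict)  # test case -> CR number

    @property
    def total(self):
        return self.passed + self.failed + self.not_run

r = SuiteResult("smoke", datetime(2008, 3, 3, 9, 30), "2.1.0-b45",
                passed=47, failed=2, not_run=1, duration_s=312.5,
                failed_case_crs={"TC-118": "CR-902"})
print(r.total)  # 50
```

Keeping the failed-case-to-CR mapping on the record mirrors the rule above that a CR raised for a failed test must carry the test case number.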
5.5. Test Tools
Test tools will be utilized wherever possible to improve testing efficiency and effectiveness. Where appropriate, automated tools will be utilized to help support and augment system testing. The Test Tool Table below lists the specific test tools to be used, dependent upon availability, for automation/execution and version control of all test documents and scripts.

Tool                          Purpose               Test Activity/Deliverable
Mercury Quality Center (QC)   Repository            Requirements traceability
Mercury Quality Center (QC)   Test Case Creation    System Test Cases
Mercury Quality Center (QC)   Test Case Execution   System Test Case Execution
Mercury Quality Center (QC)   Defect Tracking       Tracking of Defects
LoadRunner                    Performance Testing   Load/Stress Testing
Quick Test Pro (QTP)          Automation            Automation of System Test Cases
5.6. Mercury Quality Center
Mercury Quality Center is administered by the QA team. All projects are managed using this tool. To gain access to the tool and the project you are working on, you must request access at [email protected].
Classes are provided for those not familiar with QC.
5.7. Defect Reporting & Problem Resolution
The defect tracking tool, Mercury Quality Center, will be used for defect collection, tracking, closure, and reporting. Defects found in the integration test environment will be documented by QA. Defects found during Unit Test in the Development environment will be logged by Development. UAT defects will be logged by the RAs and UAT testers in the appropriate environment.

Defects will be classified by Priority and Severity. Severity is the impact of the defect on system functionality or performance. Priority is the business team's need to fix the defect prior to shipment.

QA will participate in managing, prioritizing, and resolving defects. A report of unresolved issues will be generated upon the completion of Integration, QA, and User Acceptance testing and shall be reviewed and prioritized by the Triage Team.
All defects found shall be recorded, including the following information:
1) Steps to reproduce, including authentication & URL information
2) Expected results
3) Actual results
4) Assigned severity, according to the definitions below

The Defect Process for QC is here: G:\EPMO_Design & Testing Services\EPMO_QA_Team\STANDARDS\QC Defect Status.vsd
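A minimal completeness check over the four required fields above might look like this; the field names are assumptions for the sketch, since QC defines its own defect form.

```python
# Hypothetical pre-logging check that a defect record carries the four
# required pieces of information listed above before it goes into QC.

REQUIRED = ("steps_to_reproduce", "expected_results", "actual_results", "severity")

def missing_fields(defect: dict):
    """Return the required fields that are absent or empty."""
    return [f for f in REQUIRED if not defect.get(f)]

defect = {
    "steps_to_reproduce": "1. Log in as test user  2. Open report X",
    "expected_results": "Report renders",
    "actual_results": "HTTP 500",
    "severity": 1,  # per the severity definitions in the next section
}
print(missing_fields(defect))  # [] -- record is complete enough to log
```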
5.8. Severity Definitions
All defects shall be assigned one of the following severities. When conflicts arise in assigning severity to defects, QA will contact business representatives (Service Delivery, Product Marketing & Sales) for resolution.

Severity 1: A crash, security breach, 500 or 404 web errors, missing functionality, severe usability issues causing misuse of the tool, etc.

Severity 2: A severe bug with no workaround, bad or wrong data, or a bug that won't allow testing to proceed and/or blocks functionality
Severity 3: General bugs that can be worked around or are not impeding functionality or testing

Severity 4: Cosmetic issues, very minor bugs
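A first-pass mapping from symptoms to the severity levels defined above can be sketched as follows; the symptom flags are assumptions for illustration, and genuine conflicts are still resolved with the business representatives as described above.

```python
# Illustrative first-pass severity assignment from the definitions above.
# Flags are hypothetical; a real triage would weigh more context.

def initial_severity(crash=False, security_breach=False, blocks_testing=False,
                     has_workaround=True, cosmetic_only=False):
    if crash or security_breach:
        return 1  # crash, breach, severe errors or missing functionality
    if blocks_testing or not has_workaround:
        return 2  # severe bug with no workaround, or blocks testing
    if cosmetic_only:
        return 4  # cosmetic / very minor
    return 3      # general bug that can be worked around

print(initial_severity(crash=True))            # 1
print(initial_severity(has_workaround=False))  # 2
print(initial_severity())                      # 3
print(initial_severity(cosmetic_only=True))    # 4
```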
5.9. Defect Triage
Defects logged are triaged for priority by the triage team in the following process:

Bug Triage, Severity 1-4

The triage process is comprised of the following phases.
* Required Information
1. Defect is opened and has a Severity*.
2. All defects are to have a value in the Severity* field when created.
3. The defect triage team may update the Severity* and Priority field values as necessary for the tickets selected for evaluation.
4. The creator of a new defect will still assign the value of New to the ticket Status field.
5. The creator of the new defect will assign the new ticket considering the following criteria:
a. If the ticket can be clearly assigned to a feature-set team: assign the ticket to the lead software engineer, keeping the New status
b. If the ticket clearly needs to be assigned to the Development group, but it is not certain to which feature-set team, or the problem is global in nature: assign the defect to one of the two Lead Application Architects, keeping the New status
c. If the problem itself cannot yet be identified, for any of several reasons: assign to the defect triage team, keeping the New status
6. All New tickets are assigned to a Responsible Party (Development, P&D, QA).
7. The defect triage team will analyze defects assigned or reassigned to them, either by direct assignment or by scanning defects selected by the team for analysis.
8. The Lead Architects or the feature-set leads may assign tickets to the defect triage team when clarifications are required that prevent development from continuing.
9. If the incident is a Severity 1:
a. Email notification is provided to the Triage Team to ensure that the defect is on the radar.
b. Defect is assigned to Dev
c. Dev completes the repair
d. QA retests the defect
e. Defect is closed or re-opened
10. If the incident is a Severity 2 or 3:
a. Email notification is provided to the Triage Team to ensure that the defect is on the radar.
b. Defect is assigned to Dev
c. Dev completes the repair
d. QA retests the defect
e. Defect is closed or re-opened
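The assignment rules in steps 5a-5c can be sketched as a small routing function; team names are placeholders, and in every branch the ticket keeps its New status, as the process requires.

```python
# Illustrative routing of a new defect per steps 5a-5c of the triage process.
# The ticket's Status stays "New" in every branch; only the assignee changes.

def route_new_defect(feature_team=None, clearly_development=False):
    if feature_team:                 # 5a: feature-set team is clear
        return f"lead software engineer, {feature_team} team"
    if clearly_development:          # 5b: Development, but team unknown or global
        return "a Lead Application Architect"
    return "defect triage team"      # 5c: problem itself not yet identified

print(route_new_defect(feature_team="billing"))
# lead software engineer, billing team
print(route_new_defect(clearly_development=True))
# a Lead Application Architect
print(route_new_defect())
# defect triage team
```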
5.10. Metrics
Metrics will be collected on a per-project basis. Metrics will be used to assess the effectiveness of the test planning, test execution, defect discovery, defect impact, and defect repair processes.

Reporting of test metrics will occur in real time during the test phase or at the end of the project, as deemed appropriate. This data will be collected by QA and will include:
Test coverage factors (dependent on available test tools)
Test execution rates
Defect discovery and fix rates by severity
Quantity of defects by status
Defect aging by severity
Total defects open & closed
Success rate of releases per project
Defect removal efficiency
Requirements leakage percentage
Metrics will be captured from the QC Dashboard and stored in each project folder on the share.
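Two of the metrics listed above can be computed as follows; the definition of defect removal efficiency (pre-release defects over total defects) is the conventional one and is an assumption here, since the document does not define it.

```python
from datetime import date

# Defect removal efficiency and defect aging, computed the conventional way;
# "escaped" defects are those found in production after release (assumed).

def defect_removal_efficiency(found_in_test, found_in_production):
    """DRE = pre-release defects / total defects, as a percentage."""
    total = found_in_test + found_in_production
    return 100.0 * found_in_test / total if total else 100.0

def defect_aging(open_dates, today):
    """Days each open defect has been outstanding, oldest first."""
    return sorted(((today - d).days for d in open_dates), reverse=True)

print(defect_removal_efficiency(95, 5))  # 95.0
print(defect_aging([date(2008, 3, 1), date(2008, 3, 10)], date(2008, 3, 15)))
# [14, 5]
```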
6. Moves to the Q Environments
All code moving into any QA environment (environments starting with Q) must follow the flow below.
o The flow puts all changes to an environment in the hands of a requestor (Developer, Service Center Ticket (SCT), Work Item Request (WIR), Tester, or Project Manager).
This ensures that all versions, including back-migrated code, are available.
This ensures that projects scheduled for the same delivery date are identified.
This ensures that WIRs or SCTs are not being duplicated by an ongoing project effort.
This ensures that warranty work is accounted for and will be integrated with the correct code base.
o Each change to an environment must be emailed to Don's Group.
o Each IDMS change must be mailed to the Database group.
o Don's Group works daily with QA to ensure all environments are up and have the correct information. Don's Group coordinates the effort to get the change into the Q environment at the right time and at the right version.
o If the wrong version is submitted for a move into Q, the move is rejected and the requestor is notified (by Environments or the DBAs) to modify the request.
o Once the move is approved, the Environments Team or the DBA team makes the changes to the Q environment.
o All parties are notified and testing can resume.

QA will provide support to any requestor attempting to make moves into the Q environments but will not be filling out move sheets or ensuring the validity of the code prior to its move into testing.

Project Managers are asked to include a line item for environmental changes in their project plans, since swapping of environments is not uncommon and requires some down time.
7. Release Management Process Certification
The Release Management Process ensures the implementation meets user expectations prior to
deployment into production.
7.1. Release Management Diagram
Being Designed
7.2. Certification Build Process
Upon successful completion of the entry criteria, XXX will check out the new/updated code and execute a build in the Pre-Prod environment. This build is validated by the XXX Team using a pre-defined set of validation steps provided in the deployment document supplied by the development team.

If the build is successful, the next phase of testing may begin. If a build fails, QA will notify all teams of the failed units. The build will not enter the Production environment unless all issues found during the build process have been resolved and the code retagged.
7.3. Entrance & Exit Criteria
Below are tables of entry and exit criteria per the source and target environment(s). All criteria
must be met prior to proceeding to the next phase or environment. Any deviations from the
criteria must be approved by the executive change control board.
Entry Criteria to Enter Q Testing Environment
Criteria Owner
Functional Specifications complete and approved BA
Design Specification Documents complete and approved BA/Dev
Code is functionally complete Dev
Code checked-in & Build available Dev
Test Strategy complete & approved QA
Test Cases complete QA
Successful completion of QA tests w/documented test results QA
Successful Audit of test plan/cases and results: QA
Code Freeze: Label is applied to tested code Dev
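Each entry-criteria table above can be treated as a simple gate: every row must be met, or carry a deviation approved by the executive change control board, before the move proceeds. The criteria strings in this sketch are abbreviated placeholders for the full table rows.

```python
# Generic entry-criteria gate: all rows satisfied, or covered by an
# approved deviation, before proceeding to the next environment.

def gate(criteria: dict, deviations_approved=frozenset()):
    """Return (ok, blocking) for a {criterion: met?} checklist."""
    blocking = [c for c, met in criteria.items()
                if not met and c not in deviations_approved]
    return (not blocking, blocking)

q_entry = {
    "Functional specs approved": True,
    "Code functionally complete": True,
    "Test strategy approved": True,
    "Test cases complete": False,
}
ok, blocking = gate(q_entry)
print(ok, blocking)  # False ['Test cases complete']

# With an approved deviation for the outstanding row, the gate opens:
print(gate(q_entry, deviations_approved=frozenset({"Test cases complete"})))
# (True, [])
```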
Entry Criteria to Enter Pre-Prod Testing Environment

Criteria Owner
Successful completion of Integration Test Entry Criteria QA
Build to Test environment is completed with no errors Support
Smoke tests of QA build are executed with no errors QA
Acceptance sub-set of automated and manual tests are executed successfully in QA QA
All Sev-1 and Sev-2 defects have root cause identified QA & DEV
All Sev-1 and Sev-2 defects are resolved or an action plan is in place QA/BA
All known issues and open defects are documented in Release Notes; any manual configurations needed after implementation into production are specified there QA & DEV
Internal test results documented and distributed QA
External (UAT) tests defined Business
Test Readiness Review completed successfully QA
Entry Criteria to Enter Production

Criteria Owner
Build to Certification Test environment is completed with no errors IT
Smoke tests of QA build execute with no errors QA
UAT cases are executed successfully QA/Business
All Sev-1 and Sev-2 defects have root cause identified QA/DEV
All defects with a high priority have been resolved or an action plan is in place QA/BA
Test results documented and distributed QA
All appropriate issues and open key defects are documented in the Release Notes; any manual configurations needed after implementation into production are specified there DEV
Production Readiness Review completed successfully QA
7.4. Assumption and Risks
On-time completion of all projects is contingent upon the following assumptions and associated
risks:
Assumption: Test environment(s) for internal/external testing are purchased and correctly configured.
Risk: Unavailability of environment(s) will block testing, thus preventing timely deployment to production.

Assumption: Requirements & design documents will be completed and approved per the project plan.
Risk: Modifications to deliverables and/or their due dates will impact the ability of QA & test to complete milestones/activities.

Assumption: Necessary resources (people & tools) will be available upon completion of development.
Risk: Resource constraints will extend the timeframe for test and QA milestones/activities.

Assumption: Build deployments are consistent.
Risk: Inconsistent build deployments will delay the test effort and push the delivery date.

Assumption: All internal & external testing must be complete.
Risk: Planning of parallel testing in the same environment by both QA and UAT testers will cause schedule slippage.
Appendix I Glossary
This list represents terms used at TG:
Automated Test(s)
The term "automated test", in its most generic use, implies a test that has been written in some computer or scripting language to programmatically perform some number of testing steps; it does not carry any additional information about scope, function, or environment.
To carry more meaning, the scope and the environment are added: Automated Integration Tests in the Test Environment, Automated Regression Tests in Production.

Accessibility Testing

Accessibility is a general term used to describe the degree to which a product (e.g., device, service, environment) is accessible by as many people as possible.
o Americans with Disabilities Act of 1990
o Section 508 Amendment to the Rehabilitation Act of 1973
Manual Test(s)
This describes a test with no automated steps; everything must be done by hand.
Tag
When collecting files in preparation for deployment, a build label is used to group them together. It contains an enumerated value used as a release designator, as well as file names, versions, and file locations in VSS.
Change Request (CR)

Change Requests are defects that have been found at any time up to and ending with the deployment into production. CRs are numbered sequentially, are issued a severity, given a description, assigned to a responsible party, have steps to reproduce, and move through various statuses such as: new, open, fixed, and verified fixed.
Deployment
This is the process, also known as a release, in which the developers prepare the
set of source code files in VSS to be included in the release by building a label
that references the latest versions of each individual file to be released, and then
provide instructions to the Support team, who then uses the label to move the
listed set of files into a new environment.
Development Environment
This is the environment in which the development team works closely with the
business partners, project managers, and the QA team to:
(1) translate the Requirements, Functional Specifications, and System Design into code,
(2) write unit tests to verify the code,
(3) run those tests,
(4) prepare the code for release into a test environment, and
(5) fix those defects found during deployment and testing.
The infrastructure of this environment represents only some fraction of the actual
production environment.
GUI Testing
GUI software testing is the process of verifying a product that uses a graphical
user interface to ensure it meets its written specifications. This is normally done
by executing a variety of test cases.
Load Testing
Load testing is the process of creating demand on a system or device and
measuring its response.
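A minimal sketch of the idea, assuming a hypothetical request handler standing in for the system under load: concurrent demand is created with a thread pool and the overall response time is measured.

```python
# Sketch of a load test: create concurrent demand on a (hypothetical)
# request handler and measure the elapsed response time.
import time
from concurrent.futures import ThreadPoolExecutor

def handle_request(n):
    # Stand-in for the system under load; a real test would call
    # the deployed application instead.
    time.sleep(0.01)
    return n * 2

def run_load_test(requests=50, workers=10):
    start = time.perf_counter()
    with ThreadPoolExecutor(max_workers=workers) as pool:
        results = list(pool.map(handle_request, range(requests)))
    elapsed = time.perf_counter() - start
    return {"requests": len(results), "seconds": round(elapsed, 3)}
```

The measured time would then be compared against an agreed service-level threshold.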
Integration Testing
Integration testing is the phase of software testing in which individual software
modules are combined and tested as a group. It follows unit testing and precedes
system testing.
Integration testing takes as its input modules that have been unit tested, groups
them in larger aggregates, applies tests defined in an integration test plan to those
aggregates, and delivers as its output the integrated system ready for system
testing.
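The definition above can be sketched with two hypothetical, individually unit-tested modules that are exercised together through their shared interface:

```python
# Sketch of an integration test: two modules (a hypothetical parser
# and a hypothetical aggregator) are combined and tested as a group.

def parse_order(line):
    # Module A: parse "item,qty,price" into a record.
    item, qty, price = line.split(",")
    return {"item": item, "qty": int(qty), "price": float(price)}

def order_total(records):
    # Module B: aggregate parsed records into a total.
    return sum(r["qty"] * r["price"] for r in records)

def integration_test():
    lines = ["widget,2,3.50", "gadget,1,10.00"]
    records = [parse_order(line) for line in lines]  # output of A feeds B
    assert order_total(records) == 17.0
    return "PASS"
```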
Production Environment
This is the environment where the end users make use of the software product on
a daily basis while performing mission-critical business functions with and for
customers.
Requirements Document
This document describes the software project at the highest level and may be
written by a Requirements Analyst. It may describe the project using business
cases or objectives. It may compare and contrast competitors using terms such as
market segmentation or the competitive landscape, and it may describe functional
requirements in terms of customer preference and usability with the inclusion of
Use Cases.
Regression Test
Regression testing is any type of software testing which seeks to uncover
regression bugs. Regression bugs occur whenever software functionality that
previously worked as desired stops working, or no longer works in the way that
was previously planned. Typically, regression bugs occur as an unintended
consequence of program changes.
Peer Review
This is the process in which selected team members review and comment on
various documents, tests, test results, and deployments using a pre-designed
format.
QA Department
The Quality Assurance department at TG is the group of people responsible for
executing tests against applications, projects, and environments at TG.
Deployment Document
The Deployment Document is prepared in advance of a deployment of a
software project into the production environment. This document includes an
itemized list of steps to follow in order to deploy the project, steps to verify the
deployment, and steps to roll back in the event of a deployment failure.
Functional Specification Document
This document contains the following sections: The problem definition may
include the functionality as seen through end-user scenarios, or in relationship to
other components and levels of the hierarchy. The software architecture is also
described at a high level, including relationships with other major components and
interfaces with external hardware and software. The document may also include
memory and performance requirements and an estimate of the additional resources
needed to satisfy the requirements.
Performance Testing
Performance testing is testing that is performed, from one perspective, to
determine how fast some aspect of a system performs under a particular
workload. It can also serve to validate and verify other quality attributes of the
system, such as scalability, reliability, and resource usage.
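A minimal sketch of the "how fast" perspective, assuming a hypothetical workload and using the standard library's sort as the operation being measured:

```python
# Sketch of a performance measurement: time one aspect of a system
# (here, sorting a hypothetical workload) under a particular load.
import random
import time

def measure(func, workload):
    # Return the wall-clock time the operation takes on the workload.
    start = time.perf_counter()
    func(workload)
    return time.perf_counter() - start

def run_perf_check():
    workload = [random.random() for _ in range(10000)]
    elapsed = measure(sorted, workload)
    # A real test would compare elapsed against an agreed threshold.
    return elapsed
```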
System Functional Specifications Document
This document contains a high-level description of the functional requirements of
the top level of the architectural design as seen by the end user. This set of
specifications describes how all of the components of the system fit together and
communicate; however, it does not specify the individual components' functional
requirements nor how any of the system components will be built. The functions
may be characterized as behavior and described in terms of how the system
enables the end user to perform work (i.e., the functions, performance, quality
requirements, etc.). This architectural design describes the components of the
system, their individual functions and performance, and how they interface with
each other.
Stress Testing
This is a form of testing that is used to determine the stability of a given system or
entity. It involves testing beyond normal operational capacity, often to a breaking
point, in order to observe the results.
Security Testing
Security testing is the process of determining that an application protects data and
maintains functionality as intended.
The six basic security concepts that need to be covered by security testing are:
confidentiality, integrity, authentication, authorization, availability, and non-
repudiation.
Test Strategy
A test strategy is a systematic approach to testing a system. The plan typically
contains a detailed understanding of the workflow under test.
Test Case
A test case is a set of conditions or variables under which a tester will determine
whether a requirement or use case of an application is partially or fully satisfied.
It may take many test cases to determine that a requirement is fully satisfied.
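Test cases can be expressed as data: each case pairs a set of conditions (inputs) with an expected outcome for the requirement. The requirement used here ("usernames must be 3-12 characters") and the validator are hypothetical:

```python
# Sketch of test cases as data. The requirement and the function
# under test (is_valid_username) are hypothetical examples.

def is_valid_username(name):
    return 3 <= len(name) <= 12

# Each tuple is one test case: (input conditions, expected outcome).
TEST_CASES = [
    ("ab", False),      # too short
    ("abc", True),      # lower boundary
    ("a" * 12, True),   # upper boundary
    ("a" * 13, False),  # too long
]

def run_test_cases():
    # An empty failure list means every case satisfied the requirement.
    return [(n, e) for n, e in TEST_CASES if is_valid_username(n) != e]
```

Note that several cases (both boundaries plus values outside them) are needed before the requirement can be considered fully satisfied.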
Unit Test
Unit testing is a procedure used to validate that individual units of source code are
working properly. A unit is the smallest testable part of an application.
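A minimal sketch using Python's standard `unittest` framework; the unit under test (a hypothetical currency-rounding helper) is validated in isolation:

```python
# Sketch of a unit test: the smallest testable part of an application
# (here, a hypothetical rounding helper) is checked on its own.
import unittest

def round_currency(amount):
    return round(amount, 2)

class RoundCurrencyTest(unittest.TestCase):
    def test_rounds_to_two_decimals(self):
        self.assertEqual(round_currency(3.14159), 3.14)

    def test_leaves_exact_values_unchanged(self):
        self.assertEqual(round_currency(5.50), 5.5)
```

Such tests are typically run from the command line, e.g. `python -m unittest`.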
Smoke Test
Smoke tests are a set of tests performed immediately after the project has been
deployed into a new environment. These tests confirm the success of the
deployment. They typically contain some representative tests from multiple
categories defined in this glossary, such as integration, system, or regression
tests. They may also contain simple confirmations of look and feel, connectivity,
or operability.
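A sketch of the idea, with hypothetical post-deployment checks (configuration present, database reachable) standing in for real environment checks:

```python
# Sketch of a smoke-test suite: a handful of fast checks run right
# after deployment. Both checks and the env dictionary are
# hypothetical stand-ins for real environment verification.

def check_config_loaded(env):
    return "db_url" in env

def check_database_reachable(env):
    # A real check would open a connection; here we only inspect the URL.
    return env.get("db_url", "").startswith("db://")

def run_smoke_tests(env):
    checks = [check_config_loaded, check_database_reachable]
    results = {check.__name__: check(env) for check in checks}
    # Deployment is considered successful only if every check passes.
    return all(results.values()), results
```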
Q Environment
This is the environment in which automated and manual tests of all types are
performed by developers, testers, and end-users. This environment resembles the
production environment as closely as possible and contains all the major
components and functionality of the production environment.
QC
Quality Center by Mercury is used to collect requirements, plan tests, write test
cases, execute test cases, and track and report on defects.
User Acceptance Test
User Acceptance Testing (UAT) is a process to obtain confirmation from a Subject
Matter Expert (SME), preferably the owner or client of the object under test,
through trial or review, that the modification or addition meets mutually agreed-
upon requirements. In software development, UAT is one of the final stages of a
project and often occurs before a client or customer accepts the new system.