
Manual Testing



Software Testing: Testing is a process of executing a program with the intent of finding errors.

Software Engineering: Software Engineering is the establishment and use of sound engineering principles in order to obtain, economically, software that is reliable and works efficiently on real machines.

Software engineering is based on Computer Science, Management Science, Economics, Communication Skills and an Engineering approach.

What should be done during testing? Confirming that the product:
- has been developed according to the specifications
- works perfectly
- satisfies customer requirements

Why should we do testing?
- Error-free, superior product
- Quality assurance to the client
- Competitive advantage
- Cut down costs

How to test? Testing can be done in the following ways:
- Manually
- Automation (using tools like WinRunner, LoadRunner, TestDirector …)
- A combination of manual and automation.

Software Development Phases:

Information Gathering: It encompasses requirements gathering at the strategic business level.

Planning: To provide a framework that enables the management to make reasonable estimates of:
- Resources
- Cost
- Schedules
- Size

Requirements Analysis: Data, functional and behavioral requirements are identified.

Data Modeling: Defines data objects, attributes, and relationships.
Functional Modeling: Indicates how data are transformed in the system.
Behavioral Modeling: Depicts the impact of events.


Design: Design is the engineering representation of the product that is to be built.

Data Design: Transforms the information domain model into the data structures that will be required to implement the software.

Architectural design: Relationship between major structural elements of the software. Represents the structure of data and program components that are required to build a computer-based system.

Interface design: Creates an effective communication medium between a human and a computer.

Component level Design: Transforms structural elements of the software architecture into a procedural description of software components.

Coding: Translation into source code (Machine readable form)

Testing: Testing is a process of executing a program with the intent of finding errors.

Unit Testing: It concentrates on each unit (Module, Component…) of the software as implemented in source code.

Integration Testing: Putting the modules together and construction of software architecture.

System and Functional Testing: The product is validated together with the other system elements and tested as a whole.

User Acceptance Testing: Testing by the user to collect feedback.

Maintenance: Change associated with error correction, adaptation and enhancements.

Correction: Changes the software to correct defects.
Adaptation: Modifies the software to accommodate changes to its external environment.
Enhancement: Extends the software beyond its original functional requirements.

Prevention: Changes the software so that it can be more easily corrected, adapted and enhanced.

BRS: Consists of definitions of customer requirements. Also called as CRS/URS


S/wRS: Consists of the functional requirements to be developed and the system requirements (s/w & h/w) needed to use the application.

Review: A verification method to estimate completeness and correctness of documents.

HLDD: Consists of the overall hierarchy of the system in terms of modules.

LLDD: Consists of every sub module in terms of Structural logic (ERD) and Backend Logic (DFD)

Prototype: A sample model of an application without functionality is called as prototype (Screens)

WBT: A coding-level testing technique to verify the completeness and correctness of the programs. Also called Glass Box Testing or Clear Box Testing.

BBT: An .exe-level testing technique to validate the functionality of an application with respect to customer requirements. During this test the engineer validates internal processing based on the external interface.

Verification: Are we building the system right?

Validation: Are we building the right system?

Test Scenario: What the test is going to do; Test Description can also be used instead.

Quality:

- Meets customer requirements
- Meets customer expectations (cost to use, speed in processing or performance, security)
- Possible cost
- Time to market

For developing quality software we need LCD and LCT.

LCD (Life Cycle Development): Development is carried out in multiple stages, and every stage is verified for completeness.

V model:

Build: When coding-level testing is over and the modules are completely integration tested, the resulting executable (.exe) is called a build. A build is produced after integration testing.


Test Management: Testers maintain some documents related to every project. They refer to these documents for future modifications.

Port Testing: This is to test the installation process.

Change Request: The request made by the customer to modify the software.

Defect Removal Efficiency: DRE = a / (a + b), where a = total number of defects found by testers during testing and b = total number of defects found by the customer during maintenance.

DRE is also called DD (Defect Deficiency).
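A minimal sketch of the DRE calculation (not part of the original notes; the defect counts below are made up for illustration):

```python
def defect_removal_efficiency(found_by_testers: int, found_by_customer: int) -> float:
    """DRE = a / (a + b): a = defects found by testers during testing,
    b = defects found by the customer during maintenance."""
    total = found_by_testers + found_by_customer
    if total == 0:
        return 1.0  # no defects found anywhere; treat efficiency as 100%
    return found_by_testers / total

# Hypothetical numbers: testers found 90 defects, the customer found 10.
print(defect_removal_efficiency(90, 10))   # 0.9 -> 90% DRE
```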

Independent testers or a separate testing team are involved in BBT, UAT and the test management process.

Refinement form of V-Model: From a cost and time point of view, the V-model is not applicable to small and medium scale companies. These organizations maintain a refinement form of the V-model.

Development phases and the corresponding test activities:
Information Gathering & Analysis – Assessment of Development Plan, Prepare Test Plan, Requirements Phase Testing
Design and Coding – Design Phase Testing, Program Phase Testing (WBT)
Test Environment Process / Install Build – Functional & System Testing, User Acceptance Testing
Maintenance – Port Testing, Test Software Changes, Test Efficiency


Fig: Refinement Form of V-Model

Development starts with information gathering. After requirements gathering, the BRS/CRS/URS is prepared. The Business Analyst does this.

During requirements analysis all the requirements are analyzed, and at the end of this phase the S/wRS is prepared. It consists of the functional requirements (customer requirements) plus the system requirements (h/w + s/w). It is prepared by the System Analyst.

During the design phase two types of designs are done: HLDD and LLDD. Tech Leads are involved.

During the coding phase programmers develop programs.

During unit testing, they conduct program-level testing with the help of WBT techniques.

During integration testing, the testers and programmers (or test programmers) integrate the modules and test them with respect to the HLDD.

During system and functional testing the actual testers are involved and conduct tests based on the S/wRS.

During UAT, customer-site people are also involved, and they perform tests based on the BRS.

As the above model shows, small and medium scale organizations also conduct life cycle testing, but they maintain a separate team only for functional and system testing.

(Fig: the V-model — BRS/URS/CRS, S/wRS, HLDD, LLDD and Code on the development side, mapped against User Acceptance Testing, Functional & System Testing, Integration Testing and Unit Testing on the testing side.)


Reviews during Analysis: After completion of information gathering and analysis, a review meeting is conducted in which the Quality Analyst decides on the following 5 factors:

1. Are they complete?
2. Are they correct?
3. Are they achievable?
4. Are they reasonable? (with respect to cost & time)
5. Are they testable?

Reviews during Design: After completion of the analysis of customer requirements and its reviews, technical support people (Tech Leads) concentrate on the logical design of the system. In this stage they develop the HLDD and LLDD.

After completion of the above design documents, they (Tech Leads) concentrate on reviewing the documents for correctness and completeness. In this review they apply the following factors:

- Is the design good? (understandable or easy to refer to)
- Are they complete? (are all the customer requirements satisfied or not)
- Are they correct? (is the design flow correct or not)
- Are they followable? (is the design logic correct or not)
- Do they handle error handling? (the design should specify the positive and negative flows)

Unit Testing: After completion of the design and its reviews, programmers concentrate on coding. During this stage they conduct program-level testing with the help of WBT techniques. WBT is also known as glass box testing or clear box testing.

WBT is based on the code. Senior programmers conduct this testing on programs; WBT is applied at the module level.

There are two types of WBT techniques:

(Example diagram: a User Login that leads to the Inbox / User Information for a valid user, or an Invalid User message otherwise.)


1. Execution Testing:
- Basis path coverage (correctness of every statement's execution)
- Loops coverage (correctness of loop termination)
- Program technique coverage (fewer memory cycles and CPU cycles during execution)

2. Operations Testing: Whether the software runs on the customer's expected environment platforms (OS, compilers, browsers and other system s/w).

Integration Testing: After unit testing, development people concentrate on integration testing once the dependent modules have completed unit testing. During this test, programmers verify the integration of modules with respect to the HLDD (which contains the hierarchy of modules).

There are two types of approaches to conduct Integration Testing:

Top-down Approach Bottom-up approach.

Stub: A called program. It sends control back to the main module instead of the sub module.
Driver: A calling program. It invokes a sub module in place of the main module.

Top-down: This approach starts testing from the root (main module); stubs stand in for sub modules that are not yet ready.

Bottom-Up: This approach starts testing from the lower-level modules. Drivers are used to invoke the sub modules (e.g., for login, create a driver that supplies a default uid and pwd).

(Fig: Top-down integration — Main calls Sub Module1 and Sub Module2, with a Stub in place of a missing sub module. Bottom-up integration — Sub Module1 and Sub Module2 are exercised through a Driver in place of Main.)
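A small sketch of how a stub and a driver might look in code. The login/inbox module names and the password check are hypothetical, chosen only to illustrate the idea:

```python
# --- Top-down integration: the real sub module is not ready, so a stub stands in.
def inbox_stub(user_id):
    # A stub is a "called program": it returns a canned value back to the main
    # module instead of running the real sub-module logic.
    return ["<stubbed mail 1>", "<stubbed mail 2>"]

def main_login(user_id, password, fetch_inbox=inbox_stub):
    if password != "secret":          # simplified check, for illustration only
        return "Invalid User"
    return fetch_inbox(user_id)       # integration point under test

# --- Bottom-up integration: the real main module is not ready, so a driver calls
# the finished sub module with default inputs.
def real_inbox(user_id):
    return [f"mail for {user_id}"]

def inbox_driver():
    # A driver is a "calling program": it invokes the sub module in place of Main,
    # here with a default user id.
    return real_inbox(user_id="default_user")

print(main_login("u1", "secret"))   # top-down: Main tested against the stub
print(inbox_driver())               # bottom-up: sub module tested via the driver
```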


Sandwich: This approach combines the top-down and bottom-up approaches of integration testing. Middle-level modules are tested using drivers and stubs.

System Testing: After completion of coding and the corresponding tests (unit and integration), the development team releases a fully integrated set of all modules as a build. After receiving a stable build from the development team, a separate testing team concentrates on functional and system testing with the help of BBT.

This testing is classified into 4 divisions:

- Usability Testing (ease of use or not; low priority in testing)
- Functional Testing (functionality is correct or not; medium priority in testing)
- Performance Testing (speed of processing; medium priority in testing)
- Security Testing (attempts to break the security of the system; high priority in testing)

From the tester's point of view, functional and usability tests are the most important.

Usability Testing: User-friendliness of the application or build (WYSIWYG). Usability testing consists of the following subtests:

I. User Interface Testing

- Ease of use (understandable to end users who operate it)

- Look & feel (pleasantness or attractiveness of screens)

- Speed in interface (fewer events needed to complete a task)

(Fig: Sandwich integration — Main, Sub Module1, Sub Module2 and Sub Module3, with a Driver and a Stub used around the middle-level modules.)


II. Manual Support Testing: In general, technical writers prepare user manuals after all possible tests have been executed and the resulting modifications made. Nowadays help documentation is released along with the main application.

Help documentation is also called the user manual. User manuals are actually prepared after completion of all the other system test techniques and after all the bugs have been resolved.

Functional Testing: During this stage of testing, the testing team concentrates on "meet customer requirements": whether the functionality for which the system was developed is met or not.

For every project, functionality testing is the most important. Most of the testing tools available in the market are of this type.

Functional testing consists of the following subtests.

Functionality or Requirements Testing: During this subtest, test engineers validate the correctness of every functionality in the application build through the coverage below. If there is too little time for full system testing, only functionality testing is done.

Functionality or Requirements Testing has the following coverage:

- Behavioral coverage (object properties checking)
- Input domain coverage (correctness of size and type of every input object)
- Error handling coverage (preventing negative navigation)
- Calculations coverage (correctness of output values)
- Backend coverage (data validation & data integrity of database tables)
- Service levels (order of functionality or services)
- Successful functionality (combination of all the above)

All of the above coverage is mandatory.

(Fig: System testing flow — the development team releases a build, which goes through User Interface Testing, the remaining system testing techniques (Functionality, Performance and Security tests) and Manual Support Testing.)


Input Domain Testing: During this test, the test engineer validates the size and type of every input object. For this coverage, the test engineer prepares boundary values and equivalence classes for every input object.

Boundary Value Analysis: Boundary values are used for testing the size and range of an object.

Equivalence Class Partitioning: Equivalence classes are used for testing the type of the object.
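A minimal sketch of boundary values and equivalence classes for one input object. The "age 18-60" field and the accepts_age check are assumptions made for illustration:

```python
# Hypothetical input object: an "age" field that must be an integer from 18 to 60.
MIN_AGE, MAX_AGE = 18, 60

# Boundary Value Analysis: test values at and around the size/range boundaries.
boundary_values = [MIN_AGE - 1, MIN_AGE, MIN_AGE + 1,
                   MAX_AGE - 1, MAX_AGE, MAX_AGE + 1]

# Equivalence Class Partitioning: one representative per class of the value's type.
equivalence_classes = {
    "valid integer in range": 30,
    "integer below range": 5,
    "integer above range": 99,
    "non-integer input": "abc",
}

def accepts_age(value) -> bool:
    """System under test (simplified): accept only integers within the range."""
    return isinstance(value, int) and MIN_AGE <= value <= MAX_AGE

for v in boundary_values:
    print(f"BVA {v}: {'accepted' if accepts_age(v) else 'rejected'}")
for name, v in equivalence_classes.items():
    print(f"ECP {name} ({v!r}): {'accepted' if accepts_age(v) else 'rejected'}")
```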

Recovery Testing: This test is also known as reliability testing. During this test, test engineers validate whether the application build can recover from abnormal situations or not.

Compatibility Testing: This test is also known as portability testing. During this test, the test engineer validates that the application continues to execute on the customer's expected platforms (OS, compilers, browsers, etc.).

During compatibility testing two types of problems arise:
1. Forward compatibility
2. Backward compatibility

Forward compatibility: The application is ready to run, but the technology or environment (e.g., the OS) does not support running it.

Backward compatibility: The application is not ready to run on the technology or environment.

(Fig: Recovery testing — an abnormal state is brought back to normal through backup & recovery procedures. Compatibility testing — the build is run on the expected OS platforms.)


Configuration Testing: This test is also known as hardware compatibility testing. During this test, the test engineer validates whether the application build supports different hardware devices or not.

Inter-Systems Testing: This test is also known as end-to-end testing. During this test, the test engineer validates whether the application build can coexist with other existing software at the customer site to share resources (h/w or s/w).

Installation Testing: Testing the application's installation process in the customer-specified environment and conditions.

The following conditions or tests are covered in the installation process:

Setup program: whether setup starts or not.

Easy interface: whether an easy interface is provided during installation or not.

Occupied disk space: how much disk space is occupied after the installation.

Sanitation Testing: This test is also known as garbage testing. During this test, the test engineer finds extra features in the application build with respect to the S/wRS. Most testers may not come across this type of problem.

Parallel or Comparative Testing: During this test, the test engineer compares the application build with similar applications or older versions of the same application to assess competitiveness.

This comparative testing can be done from two views:
- similar applications in the market
- upgraded versions of the application compared with older versions.

(Fig: Installation testing — the build plus the s/w components required to run the application are installed in a customer-site-like environment, checking 1. the setup program, 2. easy interface and 3. occupied disk space.)


Performance Testing: It is an advanced and expensive testing technique. During this test the testing team concentrates on the speed of processing.

Performance testing is classified into the following subtests:

1. Load Testing
2. Stress Testing
3. Data Volume Testing
4. Storage Testing

Load Testing: This test is also known as scalability testing. During this test, the test engineer executes the application under the customer-expected configuration and load to estimate performance.

Load: the number of users who try to access the system at a time.

This test can be done in two ways:

1. Manual testing.
2. By using a tool such as LoadRunner.
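A rough sketch of what a home-grown load test could look like; the virtual-user count and the simulated transaction are assumptions, and a real test would call the actual application (or use a tool such as LoadRunner):

```python
import threading
import time

def simulated_request(user_id: int, results: dict) -> None:
    """Stand-in for one virtual user's transaction against the build.
    In a real load test this would call the application (e.g. an HTTP request)."""
    start = time.perf_counter()
    time.sleep(0.01)                      # placeholder for real processing time
    results[user_id] = time.perf_counter() - start

def run_load_test(concurrent_users: int = 100) -> None:
    results: dict = {}
    threads = [threading.Thread(target=simulated_request, args=(i, results))
               for i in range(concurrent_users)]
    for t in threads:
        t.start()                          # all virtual users hit the system at once
    for t in threads:
        t.join()
    avg = sum(results.values()) / len(results)
    print(f"{concurrent_users} users, average response time {avg:.4f}s")

run_load_test(100)   # customer-expected load (number chosen for illustration)
```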

Stress Testing: During this test, the test engineer executes the application build under the customer-expected configuration and peak load to estimate performance.

Data Volume Testing: A tester conducts this test to find the maximum size of allowable or maintainable data for the application build.

Storage Testing: Executing the application under huge amounts of resources to estimate the storage limitations that the application can handle is called storage testing.

Security Testing: It is also an advanced testing technique and complex to apply. Highly skilled people with security domain knowledge are needed to conduct these tests.

This test is divided into three subtests.

Authorization: Verifies the user's identity to check whether he or she is an authorized user or not.



Access Control: Also called privileges testing; the rights given to a user to do a system task.

Encryption / Decryption: Encryption converts actual data into a secret code that may not be understandable to others. Decryption converts the secret data back into the actual data.
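A toy sketch of the idea (not a real or secure cipher; the XOR key is an assumption) showing data turned into a secret code and back:

```python
# Toy illustration only: XOR each byte with a shared key so the data travelling
# between source and destination is no longer readable as-is.
KEY = 0x5A   # shared secret, chosen arbitrarily for this sketch

def encrypt(plain_text: str) -> bytes:
    return bytes(b ^ KEY for b in plain_text.encode("utf-8"))

def decrypt(cipher_bytes: bytes) -> str:
    return bytes(b ^ KEY for b in cipher_bytes).decode("utf-8")

secret = encrypt("transfer 1000 to account 42")
print(secret)            # unreadable secret code sent over the wire
print(decrypt(secret))   # destination recovers the actual data
```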

User Acceptance Testing: After completion of all possible system test execution, the organization concentrates on user acceptance testing to collect feedback. To conduct user acceptance tests, two approaches are followed: alpha testing and beta testing.

Note: In s/w development there are two types of work based on the product: a software application (also called a project) and a product.

Software Application (Project): Requirements are obtained from the client and the project is developed. This software is for only one company and has a specific customer. For this, an alpha test is done.

Product: Requirements are obtained from the market and the product is developed. This software may be used by more than one company and has no specific customer. For this, a beta version or trial version is released in the market to do beta testing.

(Fig: Client/Server — data is encrypted at the source and decrypted at the destination, and the reply is encrypted at that end and decrypted back at the source.)

Alpha Testing | Beta Testing
For software applications applicable to a specific customer | For software products
By real customers | By customer-site-like people
In the development site | In a customer-site-like environment
Virtual environment | Real environment
Collect feedback | Collect feedback


Testing during Maintenance: After the completion of UA testing, the organization concentrates on forming a Release Team (RT). This team conducts port testing at the customer site to estimate the completeness and correctness of the application installation.

During this port testing the release team validates the following factors at the customer site:

- Compact installation
- Overall functionality
- Input device handling
- Output device handling
- Secondary storage handling
- OS error handling
- Co-existence with other software

The release team does the above tests. After their completion, the release team gives training and application support at the customer site for a period.

While customer-site people are using the application, they send Change Requests (CRs) to the company. Based on the type of CR there are two types:

1. Enhancement
2. Missed Defect

Change Control Board: It is the team that will handle customer requests for enhancement changes.

(Fig: Change Request handling — for an Enhancement, the CCB performs impact analysis, the change is performed and the s/w change is tested; for a Missed Defect, impact analysis is done, the change is performed, the old test process capability is reviewed to improve, and the s/w change is tested.)

Testing Stages Vs Roles:


Reviews in Analysis – Business Analyst / Functional Lead
Reviews in Design – Technical Support / Technical Lead
Unit Testing – Senior Programmer
Integration Testing – Developer / Test Engineer
Functional & System Testing – Test Engineer
User Acceptance Testing – Customer-site people with involvement of the testing team
Port Testing – Release Team
Testing during Maintenance – Change Control Board

Testing Terminology:

Monkey / Chimpanzee Testing: Covering only the main activities of the application during testing is called monkey testing.

Exploratory Testing: Level-by-level coverage of the activities in the application during testing is called exploratory testing.

Sanity Testing: This test is also known as Tester Acceptance Test (TAT). It checks whether the build released by the development team is stable enough for complete testing or not.

Smoke Testing: An extra shakeup over sanity testing is called smoke testing. The testing team rejects a build and returns it to the development team with reasons before starting testing.

Bebugging: The development team releases a build with known bugs to the testing team (to test the testing).

Big-bang Testing: A single stage of testing after completion of all module development is called big-bang testing. It is also known as informal testing.

Incremental Testing: Testing carried out in multiple stages is called incremental testing. It is also known as formal testing.

Manual Vs Automation: When a tester conducts a test on an application without using any third-party testing tool, the process is called manual testing. When a tester conducts a test with the help of a software testing tool, the process is called automation.

Need for Automation:

(Fig: the development team releases the build; it goes through the Sanity Test / Tester Acceptance Test and then Functional & System Testing.)


When tools are not available, only manual testing is done. If the company already has testing tools, automation may be followed.

To verify the need for automation, the following two factors are considered:

Impact of the test: indicates test repetition.
Criticality of the test: indicates that the test is complex to apply manually (e.g., load testing for 1000 users).

Retesting: Re-execution of the application to conduct the same test with multiple test data is called retesting.

Regression Testing: Re-execution of tests on a modified build to ensure that the bug fix works and to check for side effects is called regression testing.

Any dependent modules may also cause side effects.

Selection of Automation: Before starting project-level testing with a separate testing team, the project manager, test manager or quality analyst decides whether test automation is needed for that project, based on the factors below.

Type of external interface: GUI – automation; CUI – manual.

Size of external interface: large – automation; small – manual.

Expected number of releases: several releases – automation; few releases – manual.

Maturity between expected releases: more maturity – manual; less maturity – automation.

Tester efficiency: test engineers have knowledge of automation tools – automation; no knowledge of automation tools – manual.

Support from senior management: management accepts – automation; management rejects – manual.
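A rough sketch that turns the factors above into a decision; the voting scheme and the threshold are assumptions, not part of the original notes:

```python
def choose_test_approach(interface: str, interface_size: str, releases: int,
                         maturity: str, testers_know_tools: bool,
                         management_supports: bool) -> str:
    votes_for_automation = 0
    votes_for_automation += interface == "GUI"          # GUI -> automation, CUI -> manual
    votes_for_automation += interface_size == "large"
    votes_for_automation += releases > 1                # several releases expected
    votes_for_automation += maturity == "low"           # less maturity between releases
    votes_for_automation += testers_know_tools
    votes_for_automation += management_supports
    return "Automation" if votes_for_automation >= 4 else "Manual"

print(choose_test_approach("GUI", "large", 5, "low", True, True))    # Automation
print(choose_test_approach("CUI", "small", 1, "high", False, False)) # Manual
```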

(Fig: Regression example — on the modified build, previously passed tests impacted by the change and the failed tests are re-executed; e.g., 10 tests passed and the 11th failed and went back to development.)


(Fig: Test documents hierarchy — Testing Policy, Test Strategy, Test Methodology, Test Plan, Test Cases, Test Procedure, Test Script, Test Log, Defect Report and Test Summary Report, ranging from company level (C.E.O, Test Manager / QA / PM) to project level (Test Lead, Test Engineer).)


Testing Policy (company-level document):

Address
Testing Definition: Verification & validation of s/w
Testing Process: Proper test planning before starting testing
Testing Standard: 1 defect per 250 LOC / 1 defect per 10 FP
Testing Measurements: QAM, TMM and PCM

CEO Sign

QAM: Quality Assessment Measurements
TMM: Test Management Measurements
PCM: Process Capability Measurements

Test Strategy:
1. Scope & Objective: the need for testing in the organisation.
2. Business Issues: budget controlling for testing.
3. Test Approach: defines the mapping between development stages and testing factors.
4. Test Environment Specifications: required test documents to be developed by the testing team during testing.
5. Roles and Responsibilities: names of the jobs in the testing team with their responsibilities.
6. Communication & Status Reporting: required negotiation between two consecutive roles in testing.
7. Testing Measurements and Metrics: to estimate work completion in terms of quality assessment, test management and process capability.
8. Test Automation: possibilities of test automation with respect to the project requirements and the testing facilities / tools available.
9. Defect Tracking System: required negotiation between the development and testing teams to fix and resolve defects.
10. Change and Configuration Management: required strategies to handle change requests from the customer site.
11. Risk Analysis and Mitigations: common problems that appear during testing and possible solutions to recover.
12. Training Plan: the need for training to start / conduct / apply testing.


Test Factors:
1. Authorization: Security Testing; Functionality / Requirements Testing
2. Access Control: Security Testing; Functionality / Requirements Testing
3. Audit Trail: Error Handling Testing; Functionality / Requirements Testing
4. Correctness: all black box testing techniques
5. Continuity in Processing: Execution Testing; Operations Testing
6. Coupling: Inter-Systems Testing
7. Ease of Use: User Interface Testing; Manual Support Testing
8. Ease of Operation: Installation Testing
9. File Integrity: Recovery Testing; Functionality / Requirements Testing
10. Reliability: Recovery Testing; Stress Testing
11. Portability: Compatibility Testing; Configuration Testing
12. Performance: Load Testing; Stress Testing; Data Volume Testing; Storage Testing
13. Service Levels: Stress Testing; Functionality / Requirements Testing
14. Methodology: Compliance Testing
15. Maintainability: Compliance Testing


Test Methodology: The test strategy defines the overall approach. To convert the overall approach into a corresponding project-level approach, the quality analyst / PM defines the test methodology.

Step 1: Collect the test strategy.
Step 2: Determine the project type.

Project Type | Information Gathering & Analysis | Design | Coding | System Testing | Maintenance
Traditional | Y | Y | Y | Y | Y
Off-the-Shelf | X | X | X | Y | X
Maintenance | X | X | X | X | Y

Step 3: Determine the application type: depending on the application type and requirements, the QA decreases the number of columns in the TRM.
Step 4: Identify risks: depending on tactical risks, the QA decreases the number of factors (rows) in the TRM.
Step 5: Determine the scope of the application: depending on future requirements / enhancements, the QA tries to add back some of the deleted factors (rows in the TRM).
Step 6: Finalize the TRM for the current project.
Step 7: Prepare the test plan for work allocation.

Testing Process:

PET (Process Experts Tools and Technology): An advanced testing process developed by HCL, Chennai, and approved by the QA forum of India. It is a refinement form of the V-model.

(Fig: Testing process — Test Initiation → Test Planning → Test Design → Test Execution → Test Closure, with defect reporting and regression testing during execution.)


(Fig: PET process — Information Gathering (BRS), Analysis (S/wRS), Design (HLDD & LLDD), Coding, Unit Testing and Integration Testing produce the initial build; the PM / QA performs Test Initiation and the Test Lead performs Test Planning; test engineers study the S/wRS & design documents and do Test Design; Level-0 (Sanity / Smoke / TAT) testing is followed by Test Automation and Test Batches Creation; a batch is selected and executed (Level-1); on any mismatch the batch is suspended, a defect report is sent, defect fixing and bug resolving (regression, Level-2) are done on the modified build; otherwise testing continues to Test Closure, Final Regression / Pre-Acceptance / Release / Post-Mortem / Level-3 Testing, User Acceptance Testing and Sign Off.)


Test Planning: After completion of test initiation, the test plan author concentrates on writing the test plan to define what to test, how to test, when to test and who will test.

What to test – Development Plan
How to test – S/wRS
When to test – Design Documents
Who will test – Team Formation

1. Team Formation
In general the test planning process starts with testing team formation, which depends on the factors below:

- Availability of testers
- Test duration
- Availability of test environment resources

The above three are dependent factors.

Test Duration:

Common market test durations for various types of projects:

- Client/Server, Web, ERP projects (SAP, VB, Java) – small – 3-5 months
- System software (C, C++) – medium – 7-9 months
- Machine critical (Prolog, LISP) – big – 12-15 months

System software projects: network, embedded, compilers …
Machine critical software: robotics, games, knowledge base, satellite, air traffic.

2. Identify Tactical Risks
After completion of team formation, the test plan author concentrates on risk analysis and mitigations:

1) Lack of knowledge of the domain
2) Lack of budget
3) Lack of resources (h/w or tools)

(Fig: Test planning — from the Development Plan, S/wRS, design documents and TRM: Team Formation → Identify Tactical Risks → Prepare Test Plan → Review Test Plan → Test Plan.)


4) Lack of test data (amount)
5) Delays in deliveries (server down)
6) Lack of development process rigor
7) Lack of communication (ego problems)

3. Prepare Test Plan

Format:

1) Test Plan Id: unique number or name
2) Introduction: about the project
3) Test Items: modules
4) Features to be tested: the modules this team is responsible for testing
5) Features not to be tested: which ones and why not
6) Feature pass/fail criteria: when a feature above is considered pass or fail
7) Suspension criteria: abnormal situations during testing of the above features
8) Test environment specifications: required documents to prepare during testing
9) Test environment: required h/w and s/w
10) Testing tasks: necessary tasks to do before starting testing
11) Approach: list of testing techniques to apply
12) Staff and training needs: names of the selected testing team
13) Responsibilities: work allocation to the selected members
14) Schedule: dates and timings
15) Risks and mitigations: common non-technical problems
16) Approvals: signatures of the PM/QA and the test plan author

4. Review Test Plan

After completion of test plan writing, the test plan author concentrates on reviewing the document for completeness and correctness. Selected testers are also involved in this review to give feedback. In this review meeting, the testing team conducts coverage analysis:

- S/wRS-based coverage (what to test)
- Risk-based coverage (from the risk analysis point of view)
- TRM-based coverage (whether this plan covers all the tests given in the TRM)

Test Design: After completion of the test plan and the required training days, every selected test engineer concentrates on test design for the modules he or she is responsible for. In this phase the test engineer prepares a list of testcases to conduct the defined testing on those modules.

There are three basic methods to prepare testcases for core-level testing.


- Business logic based testcase design
- Input domain based testcase design
- User interface based testcase design

Business logic based testcase design: In general, test engineers write the list of testcases based on the use cases / functional specifications in the S/wRS. A use case in the S/wRS defines how a user can use a specific functionality in the application.

To prepare testcases based on use cases we can follow the approach below:

Step 1: Collect the use cases of the responsible modules.
Step 2: Select a use case and its dependencies (dependent & determinant).
Step 2-1: Identify the entry condition.
Step 2-2: Identify the input required.
Step 2-3: Identify the exit condition.
Step 2-4: Identify the output / outcome.
Step 2-5: Study the normal flow.
Step 2-6: Study the alternative flows and exceptions.
Step 3: Prepare the list of testcases based on the above study.
Step 4: Review the testcases for completeness and correctness.

TestCase Format:

After completion of testcase selection for the responsible modules, the test engineer documents every test condition in the IEEE format.

(Fig: Testcases are derived from the use cases + functional specifications in the S/wRS; the BRS, HLDD, LLDD and code (.exe) are the other baseline documents.)

TestCase Id: unique number or name


TestCase Name: name of the test condition
Feature to be tested: module / feature / service
TestSuite Id: id of the parent batch in which this case participates as a member
Priority: importance of the testcase
  P0 – basic functionality
  P1 – general functionality (input domain, error handling)
  P2 – cosmetic testcases
  (Ex: P0 – one OS, P1 – different OSs, P2 – look & feel)
Test Environment: required h/w and s/w to execute the testcase
Test Effort: (person-hours) time to execute this testcase (e.g., 20 minutes)
Test Duration: date of execution
Test Setup: necessary tasks to do before starting execution of this case
Test Procedure: step-by-step procedure to execute this testcase.

Step No. | Action | I/p Required | Expected Result | Defect ID | Comments

TestCase Pass/Fail Criteria: when the testcase is considered pass and when it is considered fail.
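A minimal sketch of the testcase format as a data structure; the field values below are hypothetical:

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class TestCase:
    # Field names follow the format described above.
    testcase_id: str
    testcase_name: str
    feature: str
    testsuite_id: str
    priority: str            # "P0" basic, "P1" general, "P2" cosmetic
    environment: str
    effort_minutes: int
    duration: str            # planned date of execution
    setup: str
    procedure: List[Tuple[str, str, str]] = field(default_factory=list)
    # each step: (action, input required, expected result)

login_case = TestCase(
    testcase_id="TC_LOGIN_001",
    testcase_name="Valid user login",
    feature="Login",
    testsuite_id="TS_AUTH",
    priority="P0",
    environment="Windows + browser",
    effort_minutes=20,
    duration="2024-01-15",
    setup="User account exists in the database",
    procedure=[("Enter user id and password", "valid uid/pwd", "Inbox page is shown")],
)
print(login_case.priority, login_case.testcase_name)
```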

Input Domain based TestCase Design: To prepare functionality and error handling testcases, test engineers use the use cases or functional specifications in the S/wRS. To prepare input domain testcases, test engineers depend on the data model of the project (ERD & LLDD).

Step 1: Identify input attributes in terms of size, type and constraints (size – range, type – int/float, constraint – primary key).
Step 2: Identify the critical attributes in that list, which participate in data retrievals and manipulations.
Step 3: Identify the non-critical attributes, which are only of input/output type.
Step 4: Prepare BVA & ECP for every attribute.

Input Attribute | ECP (Type): Valid | ECP (Type): Invalid | BVA (Size / Range): Minimum | BVA (Size / Range): Maximum

Fig: Data Matrix

User Interface based testcase design:



To conduct UI testing, the test engineer writes a list of testcases based on organization-level UI rules and global UI conventions.

For preparing these UI testcases they do not study the S/wRS, LLDD, etc. Functionality testcases source: S/wRS. Input domain testcases source: LLDD.

Testcases (applicable to all projects):
Testcase 1: Spelling checking.
Testcase 2: Graphics checking (alignment, font, style, text, size, Microsoft 6 rules).
Testcase 3: Are error messages meaningful or not? (Error handling testing checks that the related message appears; here we test whether the message is easy to understand.)

Testcase 4: Accuracy of data displayed (WYSIWYG) (amount, date of birth).

Testcase 5: Accuracy of data in the database as a result of user input. (Testcase 4 is at screen level, testcase 5 at database level.)

Testcase 6: Accuracy of data in the database as a result of external factors.

Testcase 7: Are help messages meaningful or not? (The first 6 testcases belong to UI testing and the 7th to manual support testing.)

(Examples: a form displays Bal 66.7 while the table behind the DSN stores 66.666; a mail with a .gif attachment passes through image compression at the mail server and image decompression on import at the destination.)


Review Testcases: After completing testcase design and the required (IEEE) documentation for the responsible modules, the testing team along with the test lead concentrates on reviewing the testcases for completeness and correctness. In this review the testing team conducts coverage analysis:

1. Business requirements based coverage
2. Use cases based coverage
3. Data model based coverage
4. User interface based coverage
5. TRM based coverage

Fig: Requirements Validation / Traceability Matrix.

(The matrix maps each business requirement to its sources (use cases, data model …) and to the testcases that cover it.)
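A minimal sketch of such a traceability matrix; the requirement and testcase identifiers are hypothetical:

```python
# Each business requirement is mapped to its sources and to the testcases covering it.
traceability = {
    "BR-01 User can log in": {
        "sources": ["UC-Login", "Data model: Users table"],
        "testcases": ["TC_LOGIN_001", "TC_LOGIN_002"],
    },
    "BR-02 User can send mail": {
        "sources": ["UC-Compose"],
        "testcases": [],          # gap: requirement not yet covered
    },
}

# Coverage analysis: flag requirements with no testcases against them.
for requirement, row in traceability.items():
    if not row["testcases"]:
        print(f"Not covered: {requirement}")
```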

Test Execution:

(Fig: Test execution flow — the development site delivers the initial build to the testing site; Level-0 (Sanity / Smoke / TAT) testing is done, then Level-1 (comprehensive) testing on a stable build; defects are reported, fixed and resolved, and the modified build goes through Level-2 (regression) testing, typically 8-9 times, followed by Level-3 (final regression); test automation runs alongside.)

Test Execution Levels Vs Test Cases:
Level 0 – P0 testcases


Level 1 – P0, P1 and P2 testcases as batches
Level 2 – selected P0, P1 and P2 testcases with respect to the modifications
Level 3 – selected P0, P1 and P2 testcases on the master build

Test Harness = Test Environment + Test Bed

Build Version Control: A unique numbering system for builds (FTP or SMTP).

After defect reporting, the testing team may receive a modified build or modified programs.

To maintain the original builds and modified builds, the development team uses version control software.

(Fig: Build delivery — the build is kept in a softbase on the server and transferred to the test environment via FTP; modified programs are embedded into the old build to form the modified build.)


Level 0 (Sanity / Smoke / TAT):

After receiving the initial build from the development team, the testing team installs it into the test environment. After completion of dumping / installation, the testing team checks the basic functionality of the build to decide on the completeness and correctness of test execution.

During this testing, the testing team observes the following factors on the initial build:

1. Understandable: the functionality is understandable to the test engineer.
2. Operable: the build works without runtime errors in the test environment.
3. Observable: the tester can estimate process completion and continuation in the build.
4. Controllable: processes can be started / stopped explicitly.
5. Consistent: stable navigation.
6. Maintainable: no need for reinstallations.
7. Simplicity: short navigation to complete a task.
8. Automatable: the interfaces support automation test script creation.

This Level-0 testing is also called testability or octangle testing (because it is based on 8 factors).

Test Automation: After receiving a stable build from the development team, the testing team concentrates on test automation.

Test automation is of two types: complete and selective.

Level-1 (Comprehensive Testing): After receiving a stable build from the development team and completing automation, the testing team starts executing its testcases as batches. A test batch is also known as a test suite or test set. In every batch, the base state of one testcase is the end state of the previous testcase.

During test batch execution, test engineers prepare a test log with three types of entries:

1. Passed
2. Failed
3. Blocked

Passed: all expected values are equal to the actual values.

(Fig: Test automation is either complete or selective; selective automation covers all P0 and carefully selected P1 testcases.)


Failed: any expected value deviates from the actual value.
Blocked: the testcase could not be executed because a testcase it depends on failed.
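A minimal sketch of a test log with the three entry types; the testcase ids and outcomes are hypothetical:

```python
test_log = [
    {"testcase": "TC_LOGIN_001", "status": "Passed"},   # expected == actual
    {"testcase": "TC_LOGIN_002", "status": "Failed"},   # some expected value deviated
    {"testcase": "TC_INBOX_001", "status": "Blocked"},  # depends on the failed case
]

for entry in test_log:
    print(f"{entry['testcase']:>14}: {entry['status']}")
```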

Level-2 Regression Testing: This regression testing is actually part of Level-1 testing. During comprehensive test execution, the testing team reports mismatches to the development team as defects. After receiving a defect, the development team modifies the code to resolve the accepted defects. When they release the modified build, the testing team concentrates on regression testing before continuing the remaining comprehensive testing.

Severity: The seriousness of the defect, defined by the tester in terms of impact and criticality; it indicates how important it is to do regression testing. Organizations use three severity levels: high, medium and low.

High: the tester is not able to continue the remaining testing without this mismatch being resolved (show stopper).
Medium: testing can continue, but the defect must be resolved.
Low: may or may not be resolved.

Ex:
High: database not connecting.
Medium: input domain wrong (accepting wrong values as well).
Low: spelling mistake.

Example: x, y and z are three dependent modules. If a bug is found in z:
- the bug affects z and the modules dependent on it: high
- the bug affects the full z module: medium
- the bug affects only part of the z module: low

(Fig: Testcase execution statuses — In Queue, In Progress, Skip, Blocked, Passed, Failed, Partial Pass / Fail, Closed.)


Possible ways to do Regression Testing:

Case 1: If the development team resolved a bug of high severity, the testing team re-executes all P0, all P1 and carefully selected P2 testcases with respect to that modification.

Case 2: If the development team resolved a bug of medium severity, the testing team re-executes all P0, selected P1 (80-90%) and some P2 testcases with respect to that modification.

Case 3: If the development team resolved a bug of low severity, the testing team re-executes some of the P0, P1 and P2 testcases with respect to that modification.

Case 4: If the development team performs modifications due to project requirement changes, the testing team re-executes all P0 and selected testcases.
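A rough sketch of the four cases above as a selection function; the exact fractions taken from each priority group are assumptions:

```python
def select(cases: list, fraction: float) -> list:
    """Take roughly the given fraction of a testcase list (simplified selection)."""
    return cases[: max(1, int(len(cases) * fraction))] if cases else []

def regression_suite(severity: str, p0: list, p1: list, p2: list,
                     requirement_change: bool = False) -> list:
    if requirement_change:                       # Case 4: all P0 and selected others
        return p0 + select(p1, 0.5) + select(p2, 0.2)
    if severity == "high":                       # Case 1
        return p0 + p1 + select(p2, 0.5)
    if severity == "medium":                     # Case 2
        return p0 + select(p1, 0.9) + select(p2, 0.3)
    return select(p0, 0.5) + select(p1, 0.3) + select(p2, 0.1)   # Case 3: low

p0 = ["TC1", "TC2"]; p1 = ["TC3", "TC4", "TC5"]; p2 = ["TC6", "TC7"]
print(regression_suite("high", p0, p1, p2))
```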

Severity: with respect to functionality.
Priority: with respect to the customer.

Severity: not all defects have the same severity.
Priority: not all defects have the same priority.

Resolved Bug Severity | Testcases re-executed on the modified build (to ensure the bug is resolved)
High | All P0, All P1, Selected P2
Medium | All P0, Selected P1, Some P2
Low | Some P0, Some P1, Some P2


Severity: seriousness of the defect.
Priority: importance of the defect.

Severity: important from the project functionality point of view.
Priority: important from the customer's point of view.

Defect Reporting and Tracking: During comprehensive test execution, test engineers report mismatches to the development team as defect reports in the IEEE format.

1. Defect Id: a unique number or name.
2. Defect Description: summary of the defect.
3. Build Version Id: version number of the parent build.
4. Feature: module / functionality.
5. Testcase Name and Description: name of the failed testcase with its description.
6. Reproducible: (Yes / No)
7. If yes, attach the test procedure.
8. If no, attach snapshots and strong reasons.
9. Severity: high / medium / low.
10. Priority
11. Status: new / reopen (after reopening 3 times, new programs are written).
12. Reported by: name of the test engineer.
13. Reported on: date of submission.
14. Suggested fix: optional.
15. Assigned to: name of the PM.
16. Fixed by: PM or team lead.
17. Resolved by: name of the developer.
18. Resolved on: date of solving.
19. Resolution type:
20. Approved by: signature of the PM.

Defect Age: The time gap between "resolved on" and "reported on".
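A minimal sketch of the defect age calculation; the dates below are hypothetical:

```python
from datetime import date

def defect_age(reported_on: date, resolved_on: date) -> int:
    """Defect age = time gap between 'resolved on' and 'reported on' (in days)."""
    return (resolved_on - reported_on).days

print(defect_age(date(2024, 1, 10), date(2024, 1, 17)))   # 7 days
```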


Defect Submission:

Fig: Defect submission in large scale organizations — the Test Engineer reports to the Test Lead, who reports to the Test Manager; the defect then goes to the Project Manager, Team Lead and Developers, with QA involved and transmittal reports exchanged between the teams.

Fig: Defect submission in small scale organizations — the Test Engineer reports to the Test Lead; the defect then goes to the Project Manager, Team Lead and Developers, with transmittal reports exchanged.


Defect Status Cycle:

(Fig: Defect statuses — New, Fixed (Open / Reject / Deferred), Closed, Reopen.)

Bug Life Cycle:

(Fig: Detect Defect → Reproduce Defect → Report Defect → Fix Bug → Resolve Bug → Close Bug.)

Resolution Type:

There are 12 resolution types:
1. Duplicate: rejected because the defect is the same as a previously reported defect.
2. Enhancement: rejected because the defect relates to a future requirement of the customer.
3. H/w Limitation: rejected because the issue is raised by limitations of the hardware.
4. S/w Limitation: rejected because of limitations of the s/w technology.


5. Functions as designed: rejected because the coding is correct with respect to the design documents.
6. Not Applicable: rejected due to lack of correctness in the defect.
7. No Plan to Fix It: postponed for the time being (neither accepted nor rejected).
8. Need More Information: the developers want more information to fix it (neither accepted nor rejected).
9. Not Reproducible: the developer wants more information because the problem is not reproducible (neither accepted nor rejected).
10. User Misunderstanding: each side argues that the other is thinking wrongly (extra negotiation between the tester and developer).
11. Fixed: the bug is opened in order to resolve it (accepted).
12. Fixed Indirectly: deferred for resolution (accepted).

Types of Bugs:

UI bugs (low severity):
- Spelling mistake: high priority
- Wrong alignment: low priority

Input domain bugs (medium severity):
- Object not taking expected values: high priority
- Object taking unexpected values: low priority

Error handling bugs (medium severity):
- Error message not coming: high priority
- Error message coming but not understandable: low priority

Calculation bugs (high severity):
- Intermediate results failure: high priority
- Final outputs wrong: low priority

Service level bugs (high severity):
- Deadlock: high priority
- Improper order of services: low priority

Load condition bugs (high severity):
- Memory leakage under load: high priority
- Doesn't allow customer-expected load: low priority

Hardware bugs (high severity):
- Printer not connecting: high priority
- Invalid printout: low priority

Id control bugs (medium severity): wrong version number, logo
Version control bugs (medium severity)


Source bugs (medium severity): mismatch in help documents

Test Closure: After completing all possible testcase execution and the corresponding defect reporting and tracking, the test lead conducts a test execution closure review along with the test engineers.

In this review the test lead relies on coverage analysis:

- BRS based coverage
- Use cases based coverage (modules)
- Data model based coverage (inputs and outputs)
- UI based coverage (rules and regulations)
- TRM based coverage (whether the PM-specified tests are covered or not)

The testing team tries to execute the high-priority testcases once again to confirm the correctness of the master build.

Final Regression Process:
- Gather requirements
- Effort estimation (person-hours)
- Plan regression
- Execute regression
- Report regression

User Acceptance Testing: After completing the test execution closure review and final regression, the organization concentrates on UAT to collect feedback from the customer or customer-site-like people. There are two approaches:

1. Alpha testing
2. Beta testing

Sign Off: After completion of UAT and the resulting modifications, the test lead creates a Test Summary Report (TSR). It is part of the s/w release note. The TSR consists of:

1. Test Strategy / Methodology (what tests)
2. System Test Plan (schedule)
3. Traceability Matrix (mapping requirements to testcases)
4. Automated Test Scripts (TSL + GUI map entries)

5. Final Bug Summary Report

Bug Id | Description | Found By | Status (Closed / Deferred) | Severity | Module / Functionality | Comments