
Manual Testing Material


Contents

Software
  System software
  Application Software
    Product based and Project based
Software Testing
Software Testing Methods
  Manual Testing
  Automation Testing
Roles and Responsibilities of a Functionality Test Engineer
SDLC [system development life cycle]
  Common problems in SDLC
SDLC Models
  Waterfall Model
  Prototype Model
  Incremental Model
  RAD Model
  Fish Model
    Static Testing
    Dynamic Testing
    Verification
    Validation
  V-Model
  Agile Model
Quality management
  Quality Assurance
  Quality Control
Testing Approaches
  Exhaustive testing
  Optimal testing
Testing Methodologies
  White Box testing
  Black Box testing
  Gray Box testing
Testing levels in SDLC
  Requirements Review
  Design Review
  Unit Testing
  Integration testing
    Big Bang Theory
    Incremental Approach
System Testing
  Functionality Testing
    Object properties coverage
    Error handling coverage
    I/P domain coverage
    Calculation coverage
    Database coverage
    Links coverage
  Non-functionality Testing
    Graphical User Interface testing
    Usability testing
    Performance testing
    Security testing
    Recovery testing
    Portability testing / compatibility testing
    Configuration testing
    Installation testing
    Sanitation testing / garbage testing
    Comparative testing / parallel testing
  User acceptance testing
    Alpha testing
    Beta testing
Some Other Testing Activities
  Sanity testing (build stable or not)
  Ad-hoc testing
  Exploratory testing
  Jump / Monkey testing
  Localization testing
  Internationalization testing
  Mutation testing (senior developer)
  Re-testing
  Regression testing
  Maintenance testing
Software Testing Life Cycle
  Test initiation
  Test planning
    Components in the Test Plan Document
  Test Case Design
    Inputs to derive test cases
    Fundamentals of test cases
    Types of test cases
    Test case template
    Test cases review
    Requirement Traceability Matrix
  Test Execution
    Test set
    Test execution flow
    Level-0: Sanity/Smoke Test
    Level-1: Real Testing / Comprehensive Testing
    Level-2: Re & Regression Testing
    Level-3: Final Regression Testing
  Defect Reporting
    Defect template
    Defect life cycle
  Test closure
Test cases for a 1-litre bottle
Test cases for a 4 GB pen drive
Test cases for a lift which allows a maximum of 6 persons


Software

A set of instructions, programs and documents to perform a specific task.

Software can be categorized into two types

1. System software

2. Application software

System software

This software provides an interface between the system components.

Ex: - All OS’s like windows, Linux,…etc.

Application Software

These are developed based on client business needs in order to perform their

activities

Application s/w is again divided into two types:

Product based and Project based

a) Product based: - When we develop the application based on standard requirements so that it can be sold to any customer in the market

Ex: - Ms Word is a product of Microsoft,

QTP is a product of HP

b) Project based: - When we develop the application based on the specific client requirements, and it should be delivered to that particular client only.

Ex:-www.constanza.com is a project for “constanza” client.

www.qaramu.com is a project for “Ramakrishna”

Software Testing

It is a process to identify the correctness and completeness of, and to measure the quality of, a developed software application.

S/W Testing will help to deliver reliable Application to the customer and it will

reduce maintenance cost of project.

The objective of software testing is to identify defects; when defects are removed, quality improves.

Defect: - It is a deviation b/w the expected result and the actual result in the application under test.

i) Defect can also be called bug or issue

Correctness: - Verify whether the implemented functionality is working as per expectations or not by performing operations.


Completeness:- Verify all the client business needs or requirements are covered or

not in terms of the functionalities in this Application

In simple terms, S/W testing is a combination of verification and validation.

Software Quality: - From the producer's (development team's) point of view, when the application fulfils all the client requirements, they will say it is a quality application.

From the client's, customer's or user's point of view, when the application is fit for use according to their business needs, then we will say it is a quality application.

Following are the major factors on which software quality depends:

1) Budget or cost
2) In-time release
3) Reliability
   i) Meets client requirements in terms of functionality
   ii) Meets client expectations

Software Testing Methods

In general, organizations follow 2 types of methods to validate an application:

1). Manual testing 2). Automation testing

Manual Testing: Without using any automation tools test engineer will verify actual

behaviour of App by performing the operations or transactions. For manual testing

test engineer will prepare test cases which are used to validate Applications

Advantage: - simple & easy

Disadvantages:-

Time consuming [more time taken to execute]

Human errors

Automation Testing: Automating the human activities [Test cases execution] in

order to validate Applications is called Automation Testing

Advantages: - 1) less time to execute 2) There is no possibility of human errors

Disadvantages: - 1) Automation tools are expensive 2) Skilled automation test

engineers are required

Roles and Responsibilities of a Functionality Test Engineer:

1) Analyse the client requirements [BRS & FRS]

2) Identify the test scenarios

3) Prepare test cases


4) Develop Automation script

5) Review on Test cases & Automation Test script

6) Execution of Test cases and script

7) Defect Reporting

8) Retesting & regression testing

SDLC [system development life cycle]

It describes the development process of a software project or product to fulfill the client requirements within the specified cost & time.

Following are the phases involved in SDLC

1) Requirements collection

2) Requirement Analysis

3) Design

4) Coding

5) Testing

6) Release & maintenance

Role        Phase                                   Outputs / Deliverables

B.A         Requirement collection                  BRS | URS | CRS

B.A         Feasibility study                       Budget, In time, Reliability

SA          Requirement Analysis                    FRS / SRS

DA          Design                                  GUI design doc, Data base design doc, App design doc

Dev         Coding                                  S.C. Document

Developer   Unit and Integration Testing (W.B.T)

QA Tester   Testing                                 System Testing (B.B.T)

Client      User Acceptance Testing

            Release and Maintenance


Here:-

BA ………….. Business analyst

SA …………… system Analyst

BRS ……………. Business requirement specification

DA …………… Design Architect

URS …………… User requirement specification

GUI …………. Graphical user interface

CRS …………… Client requirement specification

ADD/TDD ………..…. Application / technical design document

SRS/FRS ………….. System/Functional requirement specification

SCD ………….. Source code document

HLD ………….. High level design doc

LLD …………… Low level design doc

Requirements collection: - It is an initial activity performed in SDLC

In this phase B.A will collect requirements with an interaction of client and

collected requirements will be documented as BRS\URS\CRS

BRS: - It gives a brief description of the core logic of the client business needs, in terms of who the users of the application are & what services are required for those users in the application.

Ex: - Online Banking users: Admin, Banker, Customer

After preparation of the BRS doc, a feasibility study is performed to check whether the project is acceptable to develop or not, where the BA plays a key role.

Following are the factors analysed in feasibility study:-

1) Budget feasibility

2) Time feasibility

3) Requirements are reliable or not in terms of technology to develop


After feasibility study if project is acceptable then business analyst will

intimate to the client by releasing RFP and SLA documents.

Requirement Analysis:-

In this phase the system analyst will analyse the client business needs from the BRS and, based on that, he will prepare a detailed document called the FRS/SRS.

The FRS describes the detailed functionality of each component, like which data it should accept & how the component should work.

Design: - In this phase design Architect will design application architecture to fulfil

the client requirements which are specified at FRS. In this phase following are the

doc prepared by DA

1) GUI design doc

2) Data base Design Doc

3) Application Design Doc / TDD

i) GUI Design Doc: - It contains prototypes of the application.

A prototype helps to foresee the future implementation of the application & gives better understandability of the functionalities.

A prototype is a sample application without functionality (dummy screens).

Prototypes are not mandatory for all projects.

ii) Data base Design Doc: - It describes the database structure of the application in terms of the number of tables, the relations b/w those tables & the rules implemented in the database.

iii) ADD/ Technical Design doc :- it contains 2 types of sub doc’s 1)

HLD 2) LLD

HLD: - It describes no of modules required for a project & relation

b/w those modules

Ex:- For a Hospital management system may contain following

modules.

HMS modules: Front Desk, Registration, Billing, Pharmacy, Clinical Chart, Patient Chart


Modularization: splitting the project into a set of modules for easy development of the application is called Modularization.

Module: It is a portion of the application which has a set of similar requirements or functionalities.

LLD: - There should be individual LLD for each module in order to develop the logic

while writing the programs

Note: - Design docs are important for developers in order to write the programs

Coding / Implementation: - In this phase developers will write the program using

programming language (for windows based application) or scripting languages (for

web based application)

O/p of this phase is source code document (s.c.doc)

Testing:- After completion of coding programs are available for execution. Initially

developers will perform unit test & integration testing using W.B.T (white box testing

techniques)

After that separate testing team will perform system testing using B.B.T (Black Box

Testing)

Then client also performs user acceptance testing

Release & Maintenance:-

Release: - After system testing & creating user acceptance testing on our work

product then we deliver application to the client for further use at live environment is

called “Release or Go live or Production

Maintenance: - while using the application client can identify some defects or he may

require some other functionality in the existing system then he will send change

request (C.R) to the development team

Based on initial agreement (SLA), CCB (change control board) will work on

change request

Common problems in SDLC:-

1) Poor Requirements:- when requirements are incomplete & not clear to

understand that will be a problem to develop the application

2) Unrealistic schedule :- If too much of work is crammed/ assigned in too little

time that will be a problem

3) Inadequate testing/ Incomplete testing :-

In present scenario it is difficult to estimate how much testing is sufficient

to validate application


4) Dynamic changes in requirement :- When client continuously sending

changes in requirements then that will be a problem

5) Miscommunication: - lack of communication b/w Developers , clients , DA ,

SA..Etc.

Defect repair cost with respect to SDLC phases:-

SDLC phases DRC

Requirement 0%

Design 10%

Coding 30%

Testing 50%

Maintenance 100%

Note: - Defects identified at early stages take less cost to resolve compared to defects identified at later stages.

Note: - software testing will help to reduce maintenance cost for a project

SDLC Models:- Based on need of customer & complexity of requirement we can

adopt the anyone of the following SDLC models to develop the application

1) Water fall Model

2) Prototype Model

3) Incremental model

4) Spiral Model

5) RAD Model

6) Fish model

7) V model

8) Agile Model

Waterfall Model: -

This model is also called as linear sequential waterfall model

This model is preferable for “small” projects which have “clear” requirements

In this model entire system or application developed in sequential i.e. If one

phase is completed, then they start next phase


Waterfall Model flow: Requirements Collection -> Requirements Analysis -> Design -> Coding -> Testing -> Release & Maintenance

Advantages: - It is easy & simple to develop application

Disadvantages; - This model is not suitable for dynamic changes in the requirement

And early stages identifying defects is not possible.

Prototype Model:-

Prototype:- It is a sample application without functionality(Dummy screens)

Prototype Model flow: Requirements Collection -> Requirements Analysis -> Design -> Develop Prototypes -> Client Evaluation (demo to client) -> if the client is not satisfied: Collect Feedback -> Refine Requirements -> develop prototypes again; if the client is satisfied: Coding -> Testing -> Release & Maintenance


Prototype model is preferable when requirements are not clear or in complete.

In this model, based on the initial requirements we develop prototypes, which are presented to the customer; based on client satisfaction we then develop the actual application.

Advantages: - In this model we can get complete requirements from client before

actual coding using prototype.

Disadvantage:-It is not reusable & developing prototype is an additional activity for

developers.

Incremental Model:-

This Model is preferable for big projects which have the risk to develop

Sequential approach.

In this approach application is divided into modules & then they developed

and tested and delivered to the client module by module.

Whereas the first module are implemented should be a core functionality

of the system

Requirements for Notepad:-

1.0 File Menu

1.1.0 Basic File Menu options

1.1.1 Page setup and Print

2.0 Edit Menu

2.1.0 Basic Edit Menu options

2.1.1 Find, Replace, Undo

3.0 View Menu

4.0 Format Menu

5.0 Help

Advantages:-

Risk management is easy.

Early stages of application implementation and deliverables are possible.

Disadvantages:-

It is an open ended SDLC Model there is no fixed deadlines to complete

whole project.

(Increment diagram: requirements 1.1.0 & 2.1.0 go through D-C-T-R, then 1.1.1 & 2.1.1 go through D-C-T-R, then 3.0 & 4.0 go through D-C-T-R, where D-C-T-R = Design, Coding, Testing, Release.)


Spiral Model:-

This model is preferable for projects whose successful implementation & estimations are not possible at early stages.

It is used for research applications and mission-critical applications.

Spiral Model

In this approach the application is developed as in the incremental model, but early-stage deliverables are not possible.

Advantages:-

Less possibility of defects in the final product.

Fewer resources are required to develop the application.

Disadvantages:-

Time-consuming process (5 to 10 years)

RAD Model:-

This model is suitable when client required whole system (or) application in an

extremely short span of time like 60-90 days.

In this approach we use already existing predefined functionality components

from other environment.

Advantages:-

Possibility to deliver application in short time of span.

Disadvantages:-

The required predefined components may not be available.

(Spiral diagram: each module M1, M2, M3 goes through R.A, Design, Coding and Testing in successive cycles of the spiral.)


The application environment may not support the existing predefined components.

Fish Model:-

It is not a practical approach to develop the applications

It describes theoretical mapping b/w development activities to testing

activities.

(Fish model diagram: the development activities R.C {B.R.S}, R.A {F.R.S}, Design {T.D.D}, Coding {S.C.D}, Sy.T and R&M are mapped to the testing activities Review, Review, W.B.T, B.B.T and C.R.)

Static Testing: - It is a process of finding mistakes without execution of programs.

Dynamic Testing:- It is the process of finding mistakes by executing the programs (or) apps.

Verification:-It is the process to check are we developing product right or not.

It is a static testing approach.

In General verification performed on requirements documents {B.R.S and

F.R.S}, Design Doc,Source code, test plan doc, test cases……etc.

Following are the verification techniques:-

Peer Review

walkthrough

Inspection

Peer Review:-

It is an informal meeting

Author will deliver document to reviewer to identify any mistakes

Walkthrough:-


It is a semi informal meeting with 2 to 3 members

In this meeting author will explain about document to reviewers

Inspection:-

It is a formalized meeting with 5 to 6 members

In this meeting following members involved.

Author: - Owner of the document

Reviewer/Inspector: - The person responsible to identify mistakes

Presenter: - Who explains about the document to reviewers.

Recorder/Scribe: - Who will prepare meeting note.

Moderator: - Manager of the meeting.

Validation:-

It is a process to check whether we developed the product right or not.

In general it is performed on application source code and functionalities.

Following are the Validation Techniques:-

Unit testing

Integration testing

Functionality testing…..etc.

V-Model: - V stands for verification & validation.

(V-Model diagram: the left arm (Requirements Collection, Requirements Analysis, H.L.D, L.L.D) maps across to the right arm (User Acceptance Testing, System Testing, Integration Test, Unit Test), with Coding at the meeting point.)


This model is suitable for big projects which have the clear requirements

In general organizations prefer V-Model to develop application

In the V-Model, testing activities start at the early stages along with the development activities.

The meeting point in the V-Model is "Coding": on source code we can perform both verification and validation.

Advantages:- In this model, identifying mistakes at early stages is possible.

Disadvantages:- In this model the app is developed in a sequential approach; due to that, it is not suitable to handle dynamic changes in the requirements.

Agile Model:-

In this model the client is closely associated with the development team.

In this model the application is developed as in the incremental model, but after implementation of each feature (or) unit or component, it is delivered to the testing team and the client in order to get approval before we plan for the next component (or) unit.

In this model we expect deliverables to the client within a week (or) 2 weeks.

(Flow: Client requirements -> unit/component implemented -> Testing -> UAT -> collect feedback; once approved, plan for the next unit/component -> Testing -> UAT, and so on.)


Advantages:-

This model is suitable to handle dynamic changes in the requirement.

We can deliver reliable application to the customer

Disadvantages:-

Planning the activities of the different teams is a difficult task.

Quality management:-

It is a process of preventing defects while developing the application & ensuring

there are no defects in the final product.

Following are the terms involved in Quality management

Quality Assurance

Quality control

Quality Assurance

This team is involved throughout the SDLC to define the process, monitor the strength of the development process & provide suggestions to improve the development process.

This team involves management persons like the project manager, test manager, domain experts, etc.

Quality Control:-

This team is responsible to identify any defects in final product and to solve those

defects before deliver application to the customer

Difference b/w QA & QC:-

QA                                            QC
1. It is process oriented                     1. It is product oriented
2. It is involved throughout the SDLC         2. It is involved after the product is developed
3. It is a defect-preventive approach         3. It is a defect-detective approach
4. Reviews are QA activities                  4. S/W testing is an example of a QC activity


Testing Approaches:- There are 2 types of approaches to validate an app.

Exhaustive testing:-

Testing the app with all possible combinations is called

“Exhaustive testing”

In present scenario exhaustive testing is not possible

Optimal Testing:-

Testing the app with best possible combinations is called “optimal

testing”

In general we prefer optimal testing only

Testing Methodologies:-

To derive the best possible test cases and achieve completeness of testing on the app source code & functionality, we use testing methodologies.

There are 3 types of methodologies

1. White Box testing

2. Black Box testing (BBT)

3. Gray Box testing

White Box testing:-

It is also called as “Glass Box Testing or Open/Clear testing or

Structural testing”

WBT Techniques are used to validate source code of an app using

programming knowledge

In general developers are used to following WBT techniques to

derive test cases for code coverage.

1) Statements coverage 2) Path (or) branch coverage

Black Box Testing:-

It also called “Closed Box testing”

BBT techniques are used to validate the app functionality using requirements knowledge, without any programming knowledge.

Following are the BBT Techniques we use to derive test cases

Boundary value Analysis Equivalence class partition


WBT                                          BBT
It is design & structure based testing       It is specification based testing
It is internal logic driven testing          It is business transaction driven testing
It is for code coverage                      It is for requirements coverage

Note: - Developers validate only the application source code; they are not responsible for requirements coverage. Due to that, organizations maintain a separate testing team for requirements coverage.

Gray Box Testing:-

It is a combination of WBT & BBT. To perform GBT the person should have programming knowledge & complete requirements knowledge.

It can be performed by developers & test engineers.

It does not provide complete code coverage & requirements coverage; due to that it is not preferable.

Testing levels in SDLC:-

1) Requirements review
2) Design review
3) Unit testing
4) Integration testing
5) System testing
6) User acceptance testing

Requirements Review:-

After preparation of the requirements documents, a review is conducted to analyse the following factors:
a) All the client requirements are covered (or) not
b) Recorded information is correct (or) not
c) Requirements are reliable (or) not to develop the app
d) Requirements are understandable (or) not

Design Review:-

After preparation of the design documents, a review is conducted to analyse the following factors:
1) All the requirements are covered (or) not in the design
2) Design logic is correct or not


3) Design logic is understandable (or) not

NOTE:- For requirements review & design review they use verification techniques like peer review, walkthrough & inspection.

Unit Testing:-

It is the initial validation testing technique. Unit testing is also called "component/module testing". After coding, computer programs are available for execution; testing an individual component (or) unit within the app in order to validate it is called "Unit Testing".

It is performed by developers using WBT techniques. Following are the factors they validate in unit testing:-

1) Statement coverage:- Verify whether all the statements are participating (or) not at least once during runtime, and whether the right techniques are used to avoid repeatable statements in the program. Ex:- user-defined functions, sub programs, etc.

2) Iterative or loop statement coverage:- Verify whether loop statements are terminating as per expectations (or) not. Ex:- for loop, do while, etc.

3) Conditional statement coverage:- Verify whether we use the right conditions in conditional statements or not. Ex:- if condition, nested if, switch command, etc.

4) Path/branch coverage:- Verify the execution flow of the program based on conditions being true or false.

5) Cyclomatic complexity:-

It is a metric to measure the number of independent paths required to execute a program.

No. of independent paths = number of conditions + 1
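Illustration (not part of the original material): a minimal Python sketch assuming the common rule "independent paths = number of conditions + 1", applied to a small hypothetical function.

    # Hypothetical unit under test with 2 decision points (conditions).
    def grade(score):
        if score > 100:        # condition 1
            return "invalid"
        if score >= 50:        # condition 2
            return "pass"
        return "fail"

    conditions = 2
    independent_paths = conditions + 1   # cyclomatic complexity = 3

    # One unit test per independent path gives full path/branch coverage:
    assert grade(150) == "invalid"   # path 1: condition 1 true
    assert grade(75) == "pass"       # path 2: condition 1 false, condition 2 true
    assert grade(30) == "fail"       # path 3: both conditions false
    print("independent paths:", independent_paths)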


Integration testing:-

After unit testing, in order to build the system (or) app, developers will connect those individual modules or units or components.

"Verifying the interface (or) data communication or functionality b/w units (or) components (or) modules is called Integration Testing."

In general, implementation of all the modules at the same time may not be possible, so based on module availability we use the following approaches in integration testing.

There are two types

1) Big Bang Theory 2) Incremental Approach

Big Bang Theory:-

This approach says that when some modules are under development, we need to wait until those modules are developed in order to perform "Integration Testing".

Drawbacks:- In this approach resources will be idle; due to that, in-time release may not be possible.

Incremental Approach:-

This approach says that based on the availability of modules we can perform level-by-level integration testing. In this we use temporary programs like stubs & drivers when some of the modules are under development. Following are the approaches we use in incremental integration testing:-

1) TOP-Down Approach 2) Bottom-up approach 3) Hybrid/sandwich approach

Top-down Approach:-

In this approach we perform integration testing from main module to

available sub modules when some of sub modules under development process


In This approach we use stub

Top down Approach

Stub:-

It is a temporary program implemented by developers; it is used when a sub module is under development.

Bottom up Approach:-

In this approach we will perform integration testing from sub module to main

module when main module under development process.

(Diagrams: Top-down approach - Main connected to the available Sub1, with a Stub in place of the sub module still under development; Bottom-up approach - Sub1 and Sub2 connected through a Driver that stands in for the Main module.)


Driver:-

It is a temporary program implemented by developers; it is used when the main module is under development.

It provides a connection to the sub modules.

DRIVER
-----------

1) It is a temporary program

2) When the main module is under development, a driver is used

3) It provides a connection to the sub modules

4) A driver is a calling program

STUB
-------

1) It is also a temporary program

2) When a sub module is under development, we use a stub

3) It provides a temporary result to the main module

4) It is a called program
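Illustration (a rough Python sketch, not from the original material; the order/payment modules are hypothetical): how a stub stands in for a missing sub module and a driver stands in for a missing main module.

    # Stub: stand-in for the Payment sub module still under development.
    # It is a "called" program - it just returns a canned result to its caller.
    def payment_stub(amount):
        return {"status": "SUCCESS", "amount": amount}   # temporary, hard-coded result

    # Available module under integration test: it calls the (stubbed) sub module.
    def place_order(item, amount, pay=payment_stub):
        receipt = pay(amount)
        return f"{item}: {receipt['status']}"

    # Driver: stand-in for the Main module still under development.
    # It is a "calling" program - it invokes the available module with test data.
    def order_driver():
        result = place_order("book", 250)
        assert result == "book: SUCCESS"
        print("integration check passed:", result)

    if __name__ == "__main__":
        order_driver()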

Hybrid/Sandwiched Approach:-

When the main module & some of the sub modules are under development, then to verify the interface of the available modules we use both a stub & a driver.

(Diagram: Hybrid approach - Sub1 and Sub2 connected through a Driver, Main in the middle, and Sub3 and Sub4 supported by a Stub.)


There are 2-levels of Integration testing

Low-level Integration Testing:-

When we verify the interface within the system (or) application components, it is called "low-level integration testing".

High-level Integration Testing:-

When we verify the interface b/w different apps, it is called "high-level integration testing".

Ex;- verify transactions in ICICI bank ATM centre using HDFC Bank Debit

card.

Note: - Integration testing can be performed by developers and test engineers.

Developers perform integration testing after unit testing using WBT techniques.

Test engineers perform integration testing during system testing using BBT techniques.

System Testing:-

After unit testing and integration testing, the development team releases the build to the separate testing team.

Build means a set of integrated modules in executable form (.exe files).

After receiving the initial stable build from the developers, test engineers perform system testing.

Def.:- "Validating the whole system based on client requirements and expectations is called System Testing."

Following are the testing techniques are used in system Testing;-

1) Functionality Testing

2) Non- functionality testing

Functionality Testing:-

It is also called "requirements testing". In this test we validate the application functionality as per the client business needs.

For any type of app, functionality testing is mandatory; due to that, organizations maintain a separate functionality testing team.


Functionality testing can be performed manually (or) using one of the following functionality testing tools:

Q.T.P -> H.P (2002)

WinRunner -> H.P (1994)

SILK -> Borland

Rational Robot -> IBM

Selenium -> open source tool (free tool)

Following are factors to validate in functionality testing:-

Object properties coverage:-

Verify application object properties are changing or not based on operations

performed on application.

EXS:-enable,disable,items count….etc.

Error Handling Coverage:-

Verify the app behaviour when we perform invalid operations.

Ex:- The application should provide error messages, warning messages, pop-up windows, etc.

Verify whether the login window provides an error message when we perform invalid operations.
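Illustration (a rough sketch using Selenium, one of the tools listed in this material; the URL and element locators are hypothetical placeholders, and a matching browser driver is assumed to be installed):

    from selenium import webdriver
    from selenium.webdriver.common.by import By

    driver = webdriver.Chrome()
    driver.get("https://example.com/login")   # hypothetical URL

    # Invalid operation: submit the login form with a wrong password.
    driver.find_element(By.ID, "username").send_keys("validuser")
    driver.find_element(By.ID, "password").send_keys("wrong-password")
    driver.find_element(By.ID, "login-button").click()

    # Error handling coverage: the app should show a meaningful error message.
    error = driver.find_element(By.ID, "error-message")
    assert error.is_displayed() and error.text != "", "expected an error message"

    # Object properties coverage: e.g. the login button should still be enabled.
    assert driver.find_element(By.ID, "login-button").is_enabled()

    driver.quit()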

I/P Domain coverage:-

Verify whether I/P objects like edit (or) text boxes are accepting the customer-expected data (or) not.

Ex:- Verify the "name" edit box should allow alphanumerics from 4-16 characters.

Note:- The test engineer is responsible for deriving test data in order to validate the application. To derive the best possible test data we use the following BBT techniques:

1) Boundary value Analysis (BVA)


2) Equivalence class partition (ECP)

Boundary value Analysis (BVA):-

This technique is used to validate an i/p condition in terms of range (or) size.

This technique says there are 3 possible values to validate around each boundary; the possible values are: Minimum, Minimum-1, Minimum+1 and Maximum, Maximum-1, Maximum+1.

Note:-

In general there is a high possibility of errors in and around the boundary conditions; those we can identify with this technique.

When the boundary conditions are correct for an i/p object, then it will accept values within the specified range/size.

Ex:-Prepare text data using BVA for a “name” text box which should allow 4-16

characters.

BVA (Size)

Min    4  = Valid      Max    16 = Valid
Min-1  3  = Invalid    Max-1  15 = Valid
Min+1  5  = Valid      Max+1  17 = Invalid
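Illustration (a small Python helper, not from the original material) that derives the six BVA values for any range/size boundary:

    def bva_values(minimum, maximum):
        """Return the six BVA test values for a range/size boundary."""
        return {
            "valid":   [minimum, minimum + 1, maximum - 1, maximum],
            "invalid": [minimum - 1, maximum + 1],
        }

    # "Name" text box allowing 4-16 characters:
    print(bva_values(4, 16))
    # {'valid': [4, 5, 15, 16], 'invalid': [3, 17]}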

Ex:-Prepare text data for a “Mobile No “edit box which should allow only 10

digit number where as starting digit should be ‘9’.



BVA (Size)

Min    9000000000 = Valid      Max    9999999999  = Valid
Min-1  8999999999 = Invalid    Max-1  9999999998  = Valid
Min+1  9000000001 = Valid      Max+1  10000000000 = Invalid

Disadvantages;-

With this technique it is not possible to validate the data type of an i/p object, like alphabets, numerics, special characters, etc.

With this technique we can validate an i/p condition only with the boundary values & their adjacent values (i.e. middle values are not tested).

Equivalence class partition (ECP):-

This technique is used to validate an i/p condition in terms of data type.

This technique says: divide the test data into equivalence classes & take some sample data from each class to validate the i/p condition.

The classes are: Valid and Invalid.

Ex:- Prepare test data to validate the withdraw condition for an ATM which allows from 100 to 20,000, where the amount should be a multiple of 100.

Valid:   100 <= Amount <= 20,000/- {Amount is a multiple of 100}
Invalid: Amount < 100, Amount > 20,000/- {Amount which is not a multiple of 100}
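Illustration (a minimal Python sketch, not from the original material) drawing sample data from the valid and invalid classes for this withdraw rule:

    def is_valid_withdrawal(amount):
        """Valid class: 100 <= amount <= 20,000 and a multiple of 100."""
        return 100 <= amount <= 20_000 and amount % 100 == 0

    # Sample data drawn from each equivalence class:
    valid_samples   = [100, 5_000, 20_000]            # valid class
    invalid_samples = [50, 25_000, 150]               # below range, above range, not a multiple of 100

    for amount in valid_samples:
        assert is_valid_withdrawal(amount), amount
    for amount in invalid_samples:
        assert not is_valid_withdrawal(amount), amount
    print("ECP samples behave as expected")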


EX:-Prepare test data to validate “Name” edit box which should allow alphanumeric.

ECP (Data Type)

Valid: a-z, A-Z, 0-9          Invalid: all special characters

Verify the "Name" edit box with valid test data {combination of valid size from BVA & valid data type from ECP}. Ex:- m09bnsbf45645 (Valid)

Verify the "Name" edit box with invalid test data {invalid size from BVA (or) invalid data type from ECP}. Ex:- fdcxbn@345 (Invalid)

Calculation coverage:-

Verify whether the application generates the expected o/p values (or) not based on the given i/p data.

Ex:- Verify calculation coverage on a flight reservation app for the 5th order number record.
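Illustration (a minimal Python sketch, not from the original material; the fare rule is hypothetical): the tester computes the expected value independently and compares it with the application's output.

    # Hypothetical calculation rule: total fare = ticket price * seats, plus 5% service tax.
    def total_fare(price, seats, tax_rate=0.05):
        return round(price * seats * (1 + tax_rate), 2)

    expected = round(2500 * 3 * 1.05, 2)   # 7875.0, computed independently by the tester
    actual = total_fare(2500, 3)           # value produced by the application code
    assert actual == expected, (actual, expected)
    print("calculation verified:", actual)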

Database Coverage:-

(Figure: an Employee Registration front end with Emp.Name, Emp.Dept and Date of Jng fields and Submit / Delete / Update buttons, connected to a database table with columns S.No, Emp. name, Emp. Dept, Date of Jng.)


Verify database content with respect to operations performed on app front

end like submit/INSERT/update & delete operations

In this test we also verify the relation b/w frontend objects to database

fields.

Note:-

To find database structure for an app we refer database design

document

To validate database content we should get permission from DBA.
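Illustration (a minimal Python sketch, not from the original material) using an in-memory SQLite table as a stand-in for the application's database; the table layout and the submit helper are hypothetical:

    import sqlite3

    # In-memory database standing in for the application's real database
    # (the real table layout comes from the database design document).
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE employee (sno INTEGER PRIMARY KEY, name TEXT, dept TEXT, doj TEXT)")

    def submit_employee(name, dept, doj):
        """Stand-in for the front-end Submit operation."""
        conn.execute("INSERT INTO employee (name, dept, doj) VALUES (?, ?, ?)", (name, dept, doj))
        conn.commit()

    # Perform the front-end operation, then verify the database content directly.
    submit_employee("Ravi", "QA", "2024-01-15")
    row = conn.execute("SELECT name, dept, doj FROM employee WHERE name = ?", ("Ravi",)).fetchone()
    assert row == ("Ravi", "QA", "2024-01-15"), row
    print("database coverage check passed:", row)
    conn.close()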

Links coverage:-

Verify whether the implemented links are working correctly or not in web-based apps, like image links, text links, broken links, etc.
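Illustration (a rough Python sketch, not from the original material, using only the standard library; the page URL is a placeholder) that collects the links on a page and reports the ones that fail:

    from html.parser import HTMLParser
    from urllib.parse import urljoin
    from urllib.request import Request, urlopen

    class LinkCollector(HTMLParser):
        def __init__(self):
            super().__init__()
            self.links = []
        def handle_starttag(self, tag, attrs):
            if tag == "a":
                href = dict(attrs).get("href")
                if href:
                    self.links.append(href)

    def find_broken_links(page_url):
        """Fetch a page, collect its <a href> links and report those that fail."""
        html = urlopen(page_url, timeout=10).read().decode("utf-8", errors="ignore")
        parser = LinkCollector()
        parser.feed(html)
        broken = []
        for href in parser.links:
            url = urljoin(page_url, href)
            try:
                status = urlopen(Request(url, method="HEAD"), timeout=10).status
                if status >= 400:
                    broken.append((url, status))
            except Exception as exc:            # 4xx/5xx and network errors land here
                broken.append((url, str(exc)))
        return broken

    print(find_broken_links("https://example.com/"))   # hypothetical page under test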

Non – functionality Testing:-

Based on customer expectations we validate non-functionality factors using the following testing techniques:

Graphical User Interface testing.

Usability testing

Performance testing

Security testing

Recovery testing

Compatibility testing

Configuration testing

Installation testing

Sanitation testing

Comparative testing

Graphical User Interface testing:-

In this test we verify factors which are related to the look & feel of the application. The following are factors related to the user interface:

a) Availability of components

b) Font size or font style

c) Controls spelling

d) Controls visibility

e) Back ground colour of screen

f) Clarity of image objects like logos, graphs….etc.


Usability Testing:-

"In this test we verify the user-friendliness of an application while performing operations (ease of use)."

Following are factors related to usability test:-

a) Error messages are meaningful or not

b) Function keys implementation

c) Combination keys implementation (ctrl+p)

d) Short navigations

e) Tab implementations

Performance testing: -

After functionality testing, a separate testing team will conduct PT. In general, organizations maintain a separate PT team, which performs PT using the following performance testing tools:

Load runner

Silk performer

Web load

Rational robot performer

Following are the techniques are used in performance testing

a) Load / Scalability

b) Stress testing

c) Soak / endurance testing

d) Data volume testing

Load testing:-

In this test we verify whether the application allows the customer-expected load (concurrent users) on the customer-expected configuration system or not.

In this test we stay within the load limit to estimate the performance of the application.

Stress Testing: -

In this test we verify the peak limit of concurrent users who can perform operations on the application.

In this test we go beyond the load limit.
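Illustration (a rough Python sketch, not from the original material; the URL and the user count are placeholders, and real load/stress testing would use the tools listed above) firing concurrent requests to estimate response time under load; raising the user count beyond the expected limit turns it into a stress test:

    import time
    from concurrent.futures import ThreadPoolExecutor
    from urllib.request import urlopen

    URL = "https://example.com/"     # hypothetical application URL
    CONCURRENT_USERS = 20            # customer-expected load; raise it for a stress test

    def one_transaction(_):
        start = time.time()
        status = urlopen(URL, timeout=30).status
        return status, time.time() - start

    with ThreadPoolExecutor(max_workers=CONCURRENT_USERS) as pool:
        results = list(pool.map(one_transaction, range(CONCURRENT_USERS)))

    ok = sum(1 for status, _ in results if status == 200)
    avg = sum(elapsed for _, elapsed in results) / len(results)
    print(f"{ok}/{len(results)} requests succeeded, avg response {avg:.2f}s")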


Endurance test: -

In this test we verify how much time continuously we can perform

transactions on an application

Data volume testing: -

Verify how much data we can transfer in a specified time, in terms of kilobytes per second.

Security testing:-

In this test we verify privacy for end-user transactions in the application. The following are the factors we validate in security testing:

Authorization / Authentication: -

Verify app allows valid users & preventing invalid users or not

Access control: -

Verify app providing right services or not based on type of user

Session id: -

It is a dynamic value generated in the application server when we log in to the application.

Verify whether the session id expires or not in the following scenarios:

When the system or application is idle for some time after login

When we log out of the application

When we click on the back arrow in the browser after login

Cookies: -

These are temporary files created in the local system when we access the

application

Verify cookies will expire or not when we close the application
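Illustration (a rough sketch, not from the original material; it assumes the third-party requests library and uses hypothetical login/logout/account endpoints) checking that the session no longer gives access after logout:

    import requests

    BASE = "https://example.com"      # hypothetical application URL and endpoints

    session = requests.Session()
    session.post(f"{BASE}/login", data={"user": "validuser", "password": "secret"})

    # While logged in, a protected page should be reachable.
    assert session.get(f"{BASE}/account").status_code == 200

    # Log out, then verify the old session id no longer gives access
    # (typically a 401/403 or a redirect back to the login page).
    session.post(f"{BASE}/logout")
    after = session.get(f"{BASE}/account", allow_redirects=False)
    assert after.status_code in (301, 302, 401, 403), after.status_code
    print("session expired after logout as expected")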


Recovery testing: -

Verify how well the application is able to recover from an abnormal state to a normal state.

(Flow: normal state -> abnormal state -> recovery procedures -> normal state.)

Note: - Based on expected interruptions developers will create recovery procedures

Ex: - Verify recovery copy mechanism for Ms Word application when system

suddenly restarted

Portability testing/compatibility testing:-

Verify application support customer expected operating systems, network

environment, compilers.

In compatibility test we use following testing techniques

Forward CT: - Verify application support higher version OS & browsers

Backward CT: - Verify application support previous / lower versions of OS &

browsers

Configuration testing: -

It is also called as hardware compatibility testing. In this test we verify

application able to support customer expected configuration systems or not

like RAM, HD….etc.

Installation testing: -

In this test we mainly verify whether the application is able to install as per the "Read me" file guidelines or not.

Following are the other factors which we verify in installation testing

a) Setup programs availability

b) During installation any user interface messages from application

c) How much memory space it occupied in hard disk

d) All the related files are copied or not


e) How many instances we can install application

f) Repair or uninstall programs availability

g) Application name in all programs list

h) Quick launch icon on desktop

Sanitation testing / Garbage testing:-

In this test we verify whether any extra features are implemented in the application which are not specified in the user requirements.

Note: - When we identify any extra features in the application, we need to intimate the client; based on his feedback we report them to the developers.

Comparative testing/ parallel testing: -

Comparing our product with other existing competitive products in the market in order to identify the strengths & weaknesses of our product is called "comparative testing".

Based on customer expectations we apply some of the above non-functionality testing techniques.

User acceptance testing: -

It is performed after system testing by the client team.

The objective of UAT is to check whether the application is ready for release or not.

1) Alpha testing

2) Beta testing

Alpha testing: -

It is performed at the developer's side in a controlled environment; if any defects are identified, those are resolved immediately, and approval should be obtained from the client.

Beta testing: -

It is performed at the client side in an uncontrolled environment; if any defects are identified, those will be recorded & sent to the developers.


SOFTWARE TESTING (overview)

Verification (Static testing): "Are we developing the product right?"
  Techniques: peer review, walkthrough, inspection (requirement review, design review)

Validation (Dynamic testing): "Have we developed the product right?"
  WBT: performed by developers, needs programming knowledge, covers the internal structure of the app (code coverage); unit testing & integration testing (Big Bang / Incremental)
  BBT: performed by test engineers, needs requirements knowledge, covers system testing; functionality testing & non-functionality testing, integration testing (high & low level)
  UAT: performed by the client, needs business knowledge, checks whether the app is ready for release or not; alpha test (developer side, controlled) & beta test (client side, uncontrolled)


Some Other Testing Activities: -

1) Sanity testing

2) Ad-hoc / Random testing

3) Exploratory testing

4) Jump/monkey testing

5) Localization testing

6) Internationalization testing

7) Mutation testing

8) Re- testing

9) Regression testing

10) Maintenance testing

Sanity testing (build stable or not): -

It is also called as build verification test or tester acceptance test or stability test or

Build acceptance test.

It is an initial validation performed by the testing team whenever a build is deployed or installed in the test environment.

In a sanity test we validate the major functionalities of the application with valid data & the navigational flow of the application, to confirm whether the build is stable or not for further activities (i.e. test case execution or detailed validation).

Note: - If sanity test failed we suspend test cases execution & we rejected the build

to developers

For a rejected build developing team will release Patch files to the test engineers

Ad-hoc testing: -
It is an informal testing activity performed without proper planning and documentation; the test engineer performs it using previous experience. It is also called random testing.
We can perform ad-hoc testing in the following scenarios: -
1) When the requirement documents are not clear
2) When we want to validate an already tested application for confirmation

Exploratory testing: -
When the test engineer lacks domain knowledge, they validate the functionality by learning the application functionality while testing it.


Jump / Monkey testing: -
It is performed when there is a lack of time; in this scenario we validate only the major functionalities in the application by selecting test cases based on priority.

Localization testing:-
In this test we verify whether the application supports the customer's expected local languages.
Ex:- Verify whether the www.google.co.in web application supports the customer's expected Indian local languages.

Internationalization testing: -
In this test we verify whether the application supports international languages, currencies and time zones as per customer expectations.

Mutation testing (senior developer): -
It is performed by a senior developer by changing the logic in a program or unit in order to estimate the unit testing team's performance.
Note: - Defect seeding: - Injecting known defects into the code to estimate the unit testing team's efficiency is called "defect seeding".
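A minimal sketch of the idea, using a hypothetical discount function that is not part of this material: a senior developer deliberately changes one operator and checks whether the unit testing team's checks catch the seeded defect.

def discount(amount):
    # Original unit under test: orders of 1000 or more get a 10% discount.
    if amount >= 1000:
        return amount * 0.9
    return amount

def discount_mutant(amount):
    # Seeded defect: ">=" deliberately mutated to ">".
    if amount > 1000:
        return amount * 0.9
    return amount

def unit_tests(fn):
    # The unit testing team's checks; the boundary test catches the mutation.
    return fn(1000) == 900 and fn(500) == 500

print(unit_tests(discount))         # True  - the original passes
print(unit_tests(discount_mutant))  # False - the seeded defect is detected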

Re-testing: -
It is performed on the modified build to check whether the defects are resolved or not.
Note: - Sometimes validating the same functionality with multiple sets of test data is also called a re-testing activity.
Ex: - Validating the login functionality with multiple sets of test data, as in the sketch below.
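A minimal sketch of driving the same login check with multiple sets of test data using pytest parametrization; the login() function is a stand-in, and the credentials are borrowed from the flight reservation example later in this material.

import pytest

def login(agent_name, password):
    # Stand-in for the real application call.
    return agent_name == "Ravi" and password == "mercury"

@pytest.mark.parametrize("agent_name, password, expected", [
    ("Ravi", "mercury", True),     # valid credentials
    ("Ravi", "wrong",   False),    # valid user, invalid password
    ("",     "mercury", False),    # blank agent name
])
def test_login_with_multiple_data_sets(agent_name, password, expected):
    assert login(agent_name, password) == expected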

Regression testing: -
It is performed on the modified build to find any side effects on the existing functionalities with respect to the modifications.
We perform regression testing in the following scenarios: -
1) When defects are resolved in the modified application (partial regression testing)
2) When new features are added to the existing system based on a change request from the client (complete regression testing)
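One common way to organise this, shown here as a hedged sketch rather than a prescribed practice: tag the tests that guard existing functionality with a pytest marker, so that after a change only the affected area (partial regression) or the whole suite (complete regression) is re-run. The test names are hypothetical.

import pytest

@pytest.mark.regression
def test_existing_transfer_funds():
    assert True   # placeholder for an existing-functionality check

@pytest.mark.regression
def test_existing_open_account():
    assert True   # placeholder for another existing-functionality check

# Register the marker in pytest.ini to avoid warnings:
#   [pytest]
#   markers = regression: tests guarding existing functionality
#
# Partial regression  : pytest -m regression -k "transfer"
# Complete regression : pytest -m regression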


Re-testing and Regression testing flow:-
The test engineer prepares test cases from the FRS. The development team releases the build and the test cases are executed. Passed results are documented; for failed test cases a defect is reported to the developers. The developers resolve the defects and release a modified build, on which re-testing and regression testing are performed.

Maintenance testing:-
When new functionalities are added to the system based on a change request, those functionalities are validated in this testing.


Software Testing Life Cycle
S/W testing is one of the phases in the SDLC; roughly 40% of SDLC activities are performed in testing.
S/W testing is important to validate the application in order to deliver a reliable application to the customer.
STLC describes the set of activities performed by a separate testing team in order to validate the application as per the client requirements.

Following are the phases involved in STLC: -

a) Test initiation

b) Test planning

c) Test case design

d) Test execution

e) Defect reporting

f) Test closure

Test initiation: -
After getting a new project for testing, the test manager will initiate the test activities by preparing the test strategy document.
BRS & FRS + organization standards & available resources -> TM (test initiation) -> test strategy document

Test strategy: - It is an organization-specific document which is prepared in the early stages of project validation.
It describes the testing approach, the standards to be followed during testing & the available resources to validate the application.
The following factors are analysed to prepare the test strategy document:-
a) Application analysis using requirement documents like BRS & FRS
b) Organization standards like CMM levels
c) Available resources for automation testing, defect reporting, configuration management
d) Risk analysis:-
i) Technology risk: - Problems related to S/W & H/W



ii) Resource risk: - Availability of test engineers
iii) Support risk: - Availability of the clarification team
iv) Schedule risk: - Factors which affect the schedule of testing activities

Test planning:-
In this phase the test lead is responsible for preparing the test plan document.
For any type of application a clear test plan document is mandatory for successful validation of the application.

Test plan:-
It is a build-oriented document.
Build-to-build information may change in the test plan document, i.e. it is a "dynamic document".
It describes the testing approach, test environment, resources, test engineer names, allocated tasks, schedule, etc.
BRS & FRS + test strategy document + build #N -> TL (test planning) -> test plan document

Components in the Test Plan Document:-
1) Title:- Name of the project.
Ex:- HDFC Bank System Test Plan V1.0
2) History of the document:-
It describes the author of the test plan, when the test plan document was prepared, when the review was conducted and who conducted the review.
3) Introduction:-
a) Project overview:- It describes a brief description of the project
b) Purpose of test plan:- It describes the core intention of the test plan


c) Referred documents:- It describes which documents were referred to in order to prepare the test plan.
In general the requirement documents are referred to.
Ex: - HDFC BANK-BRS V1.0
HDFC BANK-FRS V1.0
d) Scope:-
1) In scope:- It describes the testing activities that are possible under the current test plan document
2) Out of scope:- It describes which testing activities are not possible under the current test plan document
4) Features to be tested:-
It describes the available modules & functionalities in the current build to validate.
5) Features not to be tested: - It describes which modules & functionalities are not required to be validated in the current build.

6) Entry Criteria and Exit Criteria:-
Entry criteria:- It describes when we can start test case execution for dynamic testing.
Following are the entry criteria conditions:-
All the test cases should be prepared & reviewed
The build should be deployed (or) installed at the test environment
The build should pass the sanity test
Test sets (or) test suites should be prepared
The test environment setup should be completed
Exit criteria:- It describes when we can stop testing.
Following are the exit criteria conditions:-
All the test cases should be executed at least once
A certain percentage of test cases should have passed (92-95%)
Defects identified during execution should have the status "Closed" or "Deferred"
Budget & time constraints
When UAT is completed


7) Suspension & Resumption criteria:-
Suspension criteria:-
It describes when we can suspend test execution activities.
Following are the suspension criteria conditions:-
1) When the build fails the sanity test
2) When there is a change request from the client
3) When there is a showstopper / fatal / critical defect
Resumption criteria: -
It describes when we can resume our test execution.
Following are the resumption criteria conditions: -
When a patch is released for the rejected build
When the requirements are refined based on change request acceptance
When the showstopper defects are resolved

8) Testing approach: -
It describes the testing techniques used to validate the corresponding test factors.
Ex: -
Sanity test to check the stability of the build
User interface testing to check the look & feel of the application
Functionality testing to check the application behaviour based on user operations
9) Roles & Responsibilities:-
It describes the test engineers' names & allocated tasks.
10) Test Deliverables:-
It describes which documents we need to prepare during testing & which we deliver to the client


Ex:-

Test cases review report

Requirement traceability matrix (RTM)

Defects profile

Test summary report

Test Environment:-
It is also called the "test bed".
It describes the S/W & H/W configuration of the system used to perform test execution.
In general the test environment is designed based on the client's live environment.
Risks & mitigations:-
It describes the possible risks at the test environment & the solutions for those risks.
Training needs:-
It describes the required training sessions for the testing team.
Schedule:-
It describes the time lines to perform each testing activity.
NOTE:- After preparing the test plan document, the test lead will conduct a review of the test plan document along with senior test engineers; after the review the test plan document is delivered to the test engineers.

Test Case Design:-
In this phase the test engineer is responsible for deriving test cases for the allocated module.
BRS & FRS + use case document + test scenarios -> TE (test case design) -> test case design document


Test case:-
It consists of a set of steps (or a sequence of steps) with the user action and the subsequent response from the system.
A test case describes the validation procedure of each functionality in the system.
Every test case should have a "pre-condition" (when to execute the TC).
Pre-condition:-
It describes the things to ensure in the AUT before execution of the test case (i.e. when to execute the test case).
Ex:- 1) Verify the delete mail functionality in the Gmail inbox
Precondition:-
There should be some mails existing in the Gmail inbox
Ex:- 2) Verify login validation
Precondition:-
There should be some existing users for the application (a small sketch of a pre-condition expressed in code follows below).
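A minimal sketch of the delete-mail pre-condition in code, assuming a hypothetical Inbox class that stands in for the real application: the test asserts the pre-condition before performing the user action.

class Inbox:
    # Hypothetical stand-in for the Gmail inbox.
    def __init__(self, mails):
        self.mails = list(mails)

    def delete(self, mail):
        self.mails.remove(mail)

def test_delete_mail():
    inbox = Inbox(["offer letter", "invoice"])      # pre-condition: some mails exist
    assert inbox.mails, "pre-condition failed: the inbox is empty"
    inbox.delete("invoice")                         # user action (step description)
    assert "invoice" not in inbox.mails             # expected result

test_delete_mail()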

Inputs to Derive Test Cases:-
1) Requirements like BRS & FRS
2) Use cases:-
It is an optional document; when the requirements in the FRS are too complex to understand, use cases are provided to us.
A use case describes the functionality behaviour from the user's perspective in terms of "actor's action" & "system response".
For better understandability of the FRS we use use cases.
3) Test scenario:-
It describes the functionality / requirement / test condition which we need to validate.
Note:-
Test scenarios are not mandatory for projects

Advantages of Test Scenarios:-
Based on a test scenario we can identify the number of test cases that need to be drafted
It gives complete test coverage with the test cases
Note:- A scenario can have one (or) more test cases.

Difference among use case, test scenario and test case:-
Use Case: It describes the functionality behaviour (how the system should work)
Test Scenario: It describes the test condition or requirement to validate (what to test)
Test Case: It describes the execution / validation procedure (how to test)

Fundamentals of Test Cases:-
Every test case should be simple, easy and understandable
There should be no duplicate test cases
Test cases should be consistent
Following are the test case design techniques (a small sketch of the first two follows below):-
Boundary value analysis (range or size)
Equivalence class partitioning (data type)
Error guessing:-
It is an experience-based technique; there is no specific format.
In general domain experts / functionality experts identify the possible test cases using the error guessing technique.
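A minimal sketch of boundary value analysis and equivalence class partitioning for a hypothetical "age" field that accepts values 18-60 (the field and its range are assumptions, not from this material).

def boundary_values(low, high):
    # BVA: test just below, at and just above each boundary of the range.
    return [low - 1, low, low + 1, high - 1, high, high + 1]

def equivalence_classes(low, high):
    # ECP: pick one representative value per class instead of every input.
    return {
        "valid":        (low + high) // 2,   # any value inside the range
        "invalid_low":  low - 5,             # below the range
        "invalid_high": high + 5,            # above the range
        "invalid_type": "abc",               # wrong data type
    }

print(boundary_values(18, 60))       # [17, 18, 19, 59, 60, 61]
print(equivalence_classes(18, 60))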

Types of test cases:-
There can be 4 types of test cases
1) User interface TC's:-
These TC's are derived to verify the look & feel of the application
Ex:- Availability of components, font size or font style, etc.
2) Usability test cases:-
These TC's are derived to verify the user-friendliness of the application while performing operations


Ex:- Error messages, tab implementation, function key implementation, etc.
3) Validation TC's:-
These TC's are derived to validate the application's input objects
These are also called field-level validation test cases
Ex:- List box, edit box
4) Functionality test cases:-
These test cases are derived to validate the application behaviour based on user operations.
Ex:- Push button, image link, text link
Note:-
Usability, user interface and validation TC's can be part of functionality TC's
Based on functionality & validation there can be 2 types of TC's:
Positive test cases:-
Test cases which have the primary flow of events or positive flow of events to validate
Ex:- Test case with valid inputs
Negative test cases:-
Test cases which have the negative flow of events or alternative flow of events to validate.

Test Case Template:-
In general organizations maintain their own templates to derive test cases.
The template contains predefined components to prepare a test case:
Project: - Name of the project
Module: - Name of the module
Created by: - Name of the test engineer
Created on: - Date on which the test cases were derived
Reviewed by: - Person responsible for conducting the review of the test cases
Reviewed on: - Date on which the review was conducted
Referred documents: - Inputs used to derive the test cases, like the requirement documents


Test case ID (or) Name:-
It is a unique identifier for the test case, which should be unique across the entire project.
In general, to provide a name for a test case we follow a format:
TC01-ProjectName-Module-Functionality
Test case description:-
It describes the core intention or objective of the test case.
Priority:-
Describes the importance of the test case for execution.
Priority is decided based on the importance of the functionality with respect to the client's business needs.
There can be 3 levels of priority:
1) High
2) Medium
3) Low
Advantage of priority:-
When there is a lack of time we can execute the high-priority test cases first and then the next-level priority test cases.
Step name:-
It identifies each step in a test case.
Ex:- Step1, Step2, etc.
Test data:-
It describes the inputs provided to the system during execution of the test case.
The test engineer needs to identify the required test data and provide it under the "Test Data" column.
Step description:-
Describes the user action / operation to be performed on the application during execution of the test case.
Note:- In a test case every step should describe at least one user operation and the subsequent response from the system.


Expected result:-
It is the expected behaviour of the Application Under Test (AUT).
The expected result is drafted from the client requirements.
Actual result:-
It describes the actual behaviour of the Application Under Test (AUT) or System Under Test (SUT).
The actual result is drafted from the AUT.
Status:-
It describes the status of each step in terms of pass or fail.
Pass:-
When the expected result matches the actual result
Fail:-
When the expected result does not match the actual result
Note:-
The actual result & status are both provided at test case execution time.
QC path / subject:-
In general we derive test cases in an Excel sheet and then export those test cases into QC.
Note:- To export test cases from an Excel sheet to Quality Center we should have the Excel Add-in for Quality Center.
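Pulling the template fields above together, here is a minimal sketch of one test case step as a Python structure; the field names follow the template described above and the status logic is the pass/fail rule just given, while the sample values are illustrative.

from dataclasses import dataclass

@dataclass
class TestCaseStep:
    step_name: str
    test_data: str
    step_description: str
    expected_result: str
    actual_result: str = ""
    status: str = "No Run"            # default status before execution

    def evaluate(self, actual_result):
        # Record the actual behaviour and apply the pass/fail rule above.
        self.actual_result = actual_result
        self.status = "Passed" if actual_result == self.expected_result else "Failed"
        return self.status

step1 = TestCaseStep("Step1", "one mail selected",
                     "Select one mail and click on delete",
                     "Selected mail should be deleted")
print(step1.evaluate("Selected mail should be deleted"))   # Passed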

Ex:- Write a test case to verify the delete mail functionality in the Gmail inbox.

Project : Gmail            Reviewed By :
Module : Inbox             Reviewed On :
Created By : Shiva         Referred Doc : Gmail-BRS v1.0, Gmail-FRS v1.0
Created On : _/_/_

Test Case ID/Name : TC01-Gmail-Inbox-DeleteMailFunctionality
Test Case Description : This test case is to verify the delete mail functionality
Priority : Medium

Step1 : Select one mail and click on delete - Selected mail should be deleted
Step2 : Select more than one mail and click on delete - Selected mails should be deleted
Step3 : Open a mail and click on delete - The opened mail should be deleted
Step4 : Select mails by read / unread criteria and click on delete - The corresponding mails should be deleted
Step5 : Click on delete without selecting any mail - An error message should pop up
Step6 : Select all the mails and click on delete - All mails should be deleted

Ex:- Write test cases for the login window functionalities validation in the flight reservation application.

Business Rules:-
1) Application path: "c:\program files\Hp\Quick test professional\samples\flight\app\flight4a.exe"
2) The login window should contain "Agent Name:" and "Password:" edit boxes, OK, Cancel and Help buttons & a flight image
3) Existing agent name = "Ravi" & password = "mercury"
4) The Cancel button closes the login window
5) The Help button gives information about the controls in the agent name & password fields

Test scenarios:-
TS01:- Launching the application
TS02:- Login validation
TS03:- Closing the login window
TS04:- Help functionality


Test Cases Review:-
After writing the test cases, a review is conducted to check the correctness & completeness of the test cases.
Test case review is conducted at 2 levels:
Peer review:-
Test engineers exchange their written test cases among themselves and review each other's work.
Test lead's review:-
After the peer review the test cases are sent to the test lead, who conducts a review to confirm whether the written test cases are sufficient or not to validate the application.

Requirement Traceability Matrix:-
During test case review we prepare the RTM/TM in order to analyse whether all the requirements are covered or not by the written test cases.
To prepare the RTM/TM we map each test case to its corresponding requirement.
RTM columns: Req. ID | Test Scenario / Requirement | TC ID/Name | TC Description
Ex:- Prepare a traceability matrix for the login window test case coverage, as sketched below.
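A minimal sketch of a traceability matrix as a simple mapping from the login-window scenarios above to the test cases that cover them; the test case IDs are hypothetical.

rtm = {
    "TS01-Launching application": ["TC01-Flight-Login-Launch"],
    "TS02-Login validation":      ["TC02-Flight-Login-ValidLogin",
                                   "TC03-Flight-Login-InvalidLogin"],
    "TS03-Closing login window":  ["TC04-Flight-Login-Cancel"],
    "TS04-Help functionality":    [],   # no test case mapped yet
}

# A requirement mapped to an empty list has no test case coverage yet.
uncovered = [req for req, test_cases in rtm.items() if not test_cases]
print(uncovered)    # ['TS04-Help functionality']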

Test Execution:-
After the test case review the test lead gives permission to execute the test cases in order to validate the application.
In general test cases are not executed individually; instead we prepare test sets / test suites / test batches and those are taken up for the execution activity.


Test Set:-
It is a collection of test cases based on the application flow or the dependency of functionalities in the application.
A test set helps to analyse the test execution coverage & makes it easier to execute the test cases.

Test Execution Flow:-
Level-0 Sanity/Smoke test -> Level-1 Real/Comprehensive testing -> (mismatch found) defect reporting -> defect fixing by the developers -> patch release / modified build -> Level-2 Re & Regression testing -> (no mismatches) Level-3 Final Regression testing -> UAT -> test closure

Following are the test execution levels:-

Level-0: Sanity/Smoke Test:-
It is performed whenever a build is deployed or installed at the test environment.


In this test we validate the major functionalities to confirm whether the build is stable or not for further testing activities.

Level-1: Real Testing / Comprehensive Testing:-
When the build is stable, we execute the test cases using test sets / test suites in order to validate the application.
Following are the activities performed by the test engineer during test case execution (manual testing approach):
Step 1:- Perform the operations on the AUT as per the "Step Description" in the test case.
Step 2:- Verify the actual behaviour of the application with respect to the "Expected Result" in the test case.
Step 3:- Record the actual behaviour of the AUT under the "Actual Result" column & provide a "Status" for the step.

Following are the statuses for test cases:-
Passed:- When the expected results match the actual results
Failed:- When there is any deviation between the expected result & the actual result
Blocked:- When a test case cannot be executed in the current test due to a parent functionality failure, its status is given as Blocked
Not Completed:- When some of the steps in a test case have not been executed
No Run:- When the test case has not been executed at all; this is the default status for a test case

Level-2: Re & Regression testing:-
Re-testing:- It is performed on the modified build to ensure the defects are resolved or not.
Regression testing:- It is performed to find any side effects on the existing system with respect to the modifications.
Note:- Re-testing & regression testing are performed at the same level.
Level-3: Final Regression Testing:-
It is also called "pre-release testing" or "end-to-end scenario testing".
In this test some of the functionalities are validated from the login session to the logout session, based on the defect density observed during comprehensive testing.


Defect Density:-
The number of defects identified in a specific area or module of the application.
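A minimal sketch of computing defect density per module as the count of defects logged against each module; the defect list below is illustrative, not from this material.

from collections import Counter

defects = [
    {"id": "D001", "module": "Login"},
    {"id": "D002", "module": "Inbox"},
    {"id": "D003", "module": "Inbox"},
    {"id": "D004", "module": "Inbox"},
]

density = Counter(d["module"] for d in defects)
print(density)                   # Counter({'Inbox': 3, 'Login': 1})
print(density.most_common(1))    # the module with the highest defect density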

Defect Reporting:-
During test case execution, if any defects are identified the test engineer should prepare a defect profile and send it to the test lead and the developers.
In general defects can be reported manually or by using defect reporting tools like Quality Center (an HP product) or ClearQuest (an IBM product).

Defect Template:-
1) Defect ID:-
2) Reported by:-
3) Reported on:-
4) Reported to:-
5) Test set: - Name of the test set in which the test case failed; Test case ID/Name: name of the test case along with the step which has the deviation.
6) Project: - Name of the project
7) Module: - The module in which the defect was identified.
8) Build version ID: - Version number
9) Reproducible defect: - (yes/no)
When a defect is identified, the test engineer has to recheck the same functionality. If the defect is raised again it is considered a "reproducible defect"; if the defect is not raised again while rechecking the functionality, it is considered a non-reproducible defect.
Note: - When a defect is reproducible we need to provide more information to the developers, like a defect snapshot & the reproducible steps in the test case.
10) Priority:- It describes the importance of resolving the defect; priority is defined based on the importance of the functionality in which the defect was identified.
There can be 3 levels of priority:
a) High
b) Medium
c) Low


11) Severity:-
It describes the seriousness of the defect in the system functionality with respect to executing other test cases.
It also describes the impact of the defect on the system functionality.
There can be 4 levels of severity:
1) Critical / Fatal: - There is no workaround to continue the test execution without resolving these defects
2) High: - A high-functionality defect for which we have a workaround to continue test execution
3) Medium: - For medium and low functionality defects
4) Low: - For look & feel and usability related defects
Note: - Priority and severity are both called "defect parameters"; based on them the developers identify which defects they need to resolve first.
12) Status: - It describes the state of the defect; in general, when we report a defect to the developers for the first time it has the status "New".
13) Summary: - It gives a brief description of the defect.

Defect report layout:-
Defect ID :        Project :
Reported by :      Module :
Reported on :      Build :
Reported to :      Version ID :
Reproducible :     Priority :
                   Severity :
Summary :
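A minimal sketch of the defect template fields as a Python structure; the field names follow the template above and the sample values are illustrative.

from dataclasses import dataclass

@dataclass
class Defect:
    defect_id: str
    reported_by: str
    reported_to: str
    project: str
    module: str
    build_version: str
    reproducible: bool
    priority: str            # High / Medium / Low
    severity: str            # Critical / High / Medium / Low
    summary: str
    status: str = "New"      # a first report always goes out with status "New"

d = Defect("D001", "Shiva", "Dev. Team", "Gmail", "Inbox", "v1.0",
           True, "High", "Critical",
           "Delete does nothing when a single mail is selected")
print(d.status)    # New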

Defect life cycle: -
It describes the various statuses of a defect from its identification to its closure.
(Or)
The time gap between the defect identification time and the defect closure time is called the "defect life cycle".
Following are the statuses provided by the test engineer: -
New: - When we report a deviation to the developers for the first time
Closed: - When the test engineer is satisfied with the defect resolution work in the modified build


Defect life cycle flow:-
1) The test engineer selects a test case for execution and analyses the result.
2) If a mismatch is found, a defect report is prepared with Status = New and sent to the development team.
3) The development team reviews the defect:
If it is not in scope / not accepted, Status = Rejected (end of DLC)
If it is already reported, Status = Duplicate (end of DLC)
If it is accepted, Status = Open
4) The development team resolves the defect, Status = Fixed, and releases the modified build.
5) The test engineer performs re-testing and regression testing on the modified build:
If the defect is fixed, Status = Closed; the result is documented and the next test case is taken up.
If the defect is not fixed, Status = Reopen, and the cycle repeats.


Following are the statuses provided by the developers:-
1) Duplicate: - When the defect is similar to an earlier defect
2) Add: - When the developers require more information about the defect
3) Rejected: - When the developers do not accept it as a defect
4) Deferred: - The developers accept it as a defect, but it will be considered in the next release of the application to the customer due to its low priority & low severity
5) Open: - When the developers accept the defect & are ready to resolve it
6) Fixed: - When the defect is resolved in the modified build
7) Defect age: - The time gap between the defect identification date and the closed date
8) Latent defect: - A defect which is identified only in the later stages of testing
9) Masked defect: - Some defects may hide some other defects
10) User acceptance test completed: - The test lead analyses the exit criteria conditions specified in the test plan to stop the testing activities

Test closure: - It is performed by the test lead after the execution of all the test cases & completion of the user acceptance test.

Ex: - 1
Test cases for a 1-litre water bottle:
Verify the company name
Verify the quantity is 1 litre or not
Verify the price of the water bottle
Verify the bottle is sealed or not
Verify the manufacturing & expiry dates
Verify the purity of the water
Verify the colour of the water bottle
Verify the cooling capacity of the water bottle

Ex: - 2
Test cases for a 4 GB pen drive:
Verify the company name
Verify the memory capacity
Verify the price of the pen drive
Verify the warranty of the pen drive
Verify the model of the pen drive
Verify whether the pen drive is detected or not


Verify operations on the pen drive like copy, paste, delete, save & format
Verify whether the pen drive allows 4 GB of data to be saved
Verify whether the pen drive provides any information when we try to save more than 4 GB of data
Verify the safe removal operation
Verify the impact on existing data when the pen drive is removed during a process

Test cases for a lift which allows a maximum of 6 persons:
Verify the lift allows 6 persons or not
Verify whether the doors close automatically or manually
Verify the power consumption
Verify whether the lift provides any information when it is overloaded
Verify the lift does not move when the doors are not closed properly
Verify whether the lift stops at the correct floor level
Verify whether the lift displays the correct floor numbers
Verify whether the lift provides any information about the direction in which it is moving
Verify the possibility of opening the doors in an emergency
Verify the working condition of the internal switch board
Verify how the lift stops when different floor numbers are operated randomly
Verify the working condition of the light

Ex: - TC’s for following components

1) Teacup 2) Pen 3) watch 4) Mobile 5) Phone 6) Fan

Ex: - TC's for an employee search page; business needs:
1) Application URL: - www.Empsearch.com
2) The employee search page should contain Emp. Name:, Emp. Dept.:, Date of Joining (From: and To:) input fields, Search and Clear buttons and a search results table.
3) We can search for an employee record using either Emp. Name or Emp. Dept. or DOJ.
4) The Clear button should clear the data.