
YILDIZ TECHNICAL UNIVERSITY

FACULTY OF ELECTRICAL AND ELECTRONICS ENGINEERING

COMPUTER ENGINEERING DEPARTMENT

SENIOR PROJECT

TEST REPORTING CENTER

Project Supervisor: Assist. Prof. Dr. Yunus Emre Selçuk

Project Group

02011029 Emrah İLERİGELEN

Istanbul, 2011


All rights reserved to Yıldız Technical University, Computer Engineering Department.


CONTENTS

ABBREVIATION LIST
FIGURE LIST
TABLE LIST
ABSTRACT
ÖZET
1. INTRODUCTION
   1.1. Software Test
      1.1.1. Test Scenarios (Test Cases)
      1.1.2. Defect
   1.2. Software Test Progress
      1.2.1. Defect Life Cycle
   1.3. Existing System
      1.3.1. Technoport
      1.3.2. Quality Center
2. FEASIBILITY
   2.1. Microsoft Visual Studio 2010
   2.2. Microsoft SQL Server 2008 Express Edition
   2.3. Microsoft Access
   2.4. Technical Feasibility
      2.4.1. User
      2.4.2. Developer
   2.5. Economic Feasibility
      2.5.1. User
      2.5.2. Developer
3. SYSTEM ANALYSIS
   3.1. System Analysis Overview
   3.2. Project Plan
4. SYSTEM DESIGN
   4.1. User Authentication and User Roles
      4.1.1. Admin
      4.1.2. Tester (User)
      4.1.3. Visitor
   4.2. Main Menu
   4.3. Defects
      4.3.1. Production Defects
      4.3.2. Open Defects
      4.3.3. Hot Defects
   4.4. Projects / Demands
   4.5. Defect KPI
      4.5.1. Weekly Defect Trend
      4.5.2. Defect Turnaround Duration
      4.5.3. Number of Defects
      4.5.4. Defect Turnaround Duration by Department
      4.5.5. Average Time for Defect States
   4.6. Weekly Report
      4.6.1. New Weekly Report
      4.6.2. View Weekly Reports
   4.7. Tasks
      4.7.1. Create Tasks
      4.7.2. Show Tasks
5. CONCLUSION
   5.1. Possible Developments
REFERENCES
CURRICULUM VITAE


ABBREVIATION LIST

TRC  Test Reporting Center
QC   HP Quality Center
KPI  Key Performance Indicators
DB   Database
SP   Stored Procedure
MB   Megabyte


FIGURE LIST

Figure 1.1 Basic Test Progress
Figure 1.2 Defect Life Cycle
Figure 1.3 Technoport Task Fields 1
Figure 1.4 Technoport Task Fields 2
Figure 1.5 Technoport Task Fields 3
Figure 1.6 Quality Center Defect Fields
Figure 1.7 Quality Center Required Test Case Fields
Figure 1.8 Basic Test Progress
Figure 3.1 System DB Overview
Figure 3.2 System DB Overview 2
Figure 3.3 Project Plan
Figure 4.1 Production Defects Page
Figure 4.2 Production Defects Page - Filter
Figure 4.3 Production Defects Details Page
Figure 4.4 Open Defects Page
Figure 4.5 Open Defects Page - Filter
Figure 4.6 Hot Defects Page
Figure 4.7 Project / Demand Page Filters
Figure 4.8 Project / Demand Page
Figure 4.9 Weekly Defect Trend
Figure 4.10 Weekly Defects
Figure 4.11 Defect Turnaround Time
Figure 4.12 Defect Turnaround Time
Figure 4.13 Number of Defects
Figure 4.14 Number of Defects by Departments
Figure 4.15 Defect Turnaround Duration by Department
Figure 4.16 Defect Turnaround Duration by Department
Figure 4.17 Average Time for Defect States
Figure 4.18 Average Time for Defect States
Figure 4.19 Weekly Reports
Figure 4.20 Weekly Reports – Item Inserted
Figure 4.21 Successful Message
Figure 4.22 Successful Message
Figure 4.23 View Weekly Reports
Figure 4.24 Create Task Page
Figure 4.25 Task Created Message
Figure 4.26 Show Tasks Page


TABLE LIST

Table 2.1 User Computer Hardware Requirements
Table 2.2 Visual Studio 2010 hardware requirements
Table 2.3 System requirements for MS-SQL 2008
Table 2.4 Developer System Requirements
Table 4.1 User access to pages
Table 4.2 Responsible Groups for Defect Status Changes
Table 4.3 Responsible Groups for Defect Status Changes (Production Defects)


ABSTRACT

Preparing a neat and orderly report of a job well done is vital in any line of business. However, these reports must be standardized and should be both easy to prepare and easy to read. Therefore, most companies prepare templates for work reports. When preparing these templates, the aim is to shorten and optimize the reports while reducing the time needed to prepare them.

However, if the work in question spans many applications, preparing the report becomes much more labour-intensive and more error-prone. What should be done is to bring the data in the applications together automatically, via common variables that let the applications interact, and to prepare the report from this raw data.

The application developed here makes it possible for the user to keep abreast of the work done, from test development and job assignment to completion and report preparation.


ÖZET

Reporting work on which time and effort have been spent in a proper and meaningful way, in other words being able to show one's own work, is an indispensable skill in many professions. However, these reports must be standardized and must be easy to prepare, easy to review and easy to understand. For this reason, many companies create templates for their reports. The aim of these templates is for the report to convey what is needed as briefly as possible and to take as little time as possible.

However, if the work is spread over many different applications, the human labour in these reports, and therefore the human error, increases. What needs to be done here is to merge the data in the applications automatically, through common components that let the applications interoperate, to form the raw data, and to prepare the required report from it.

Because of the nature of the project development process, tracking the whole development gains importance at the very end, that is, while the work is in the testing phase. Reports that are spread over long periods at the beginning of a project are monitored continuously as daily, weekly and monthly reports while the work is being tested. For this reason, in some projects the time devoted to reporting can reach the point of delaying the work itself.

The application developed provides tracking of the process from the assignment of the work during test development until the work is finished and reported.


1. INTRODUCTION

Testing is the last phase of the development process and the one that can least afford mistakes. An analyst can make mistakes and a developer can make mistakes, but if a tester makes one, the customers will face it.

Because of this risk, the testing phase is the period for which a director most often asks for reports. Reporting is sometimes as important as the work itself, because reporting is the only way to show the work. There are plenty of tools that companies use to measure employee effort. This project is about the reporting of a test team in a telecommunications company.

1.1. Software Test

Software is a combination of documentation, operating procedures and code. Testing can be defined as checking the code against the documentation and the operating procedures of that code. [3]

1.1.1. Test Scenarios (Test Cases)

A test case in software engineering is a set of conditions or variables under which a

tester will determine whether an application or software system is working correctly or

not. [4]

1.1.2. Defect

A defect is the common term used to describe an error, flaw, mistake, failure, or fault in

a computer program or system that produces an incorrect or unexpected result, or causes

it to behave in unintended ways. [4]

1.2. Software Test Progress

The basic test progress is shown in Figure 1.1, which illustrates defect handling during testing. If a defect is detected in a test case, the case is retested after the defect is fixed. If the defect persists, or another defect related to the same case is detected, the defect is assigned to the developer again and the case is retested once more.


Figure 1.1 Basic Test Progress

1.2.1. Defect Life Cycle

A basic defect life cycle is given in Figure 1.2. A defect is opened in “New” status. If the developer agrees on the defect, its status is changed to “Open”. After the developer fixes the defect, it is assigned back to the tester in “Fixed” status. The tester retests the defect and, if it is really fixed, changes its status to “Closed”.

If the developer does not agree with a detected defect (sometimes because of a wrong test case or invalid test data), the developer changes the defect status to “Rejected” and assigns it back to the tester. The tester retests the defect and, if the error persists, changes the status to “Reopen” and assigns it to the developer again.

In some cases this process takes a long time and the defect status changes many times.
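This life cycle amounts to a fixed set of allowed transitions and the group that performs each one. A minimal sketch of the state machine as a lookup table is given below; the table and column names are illustrative and are not QC objects:

    -- Sketch: the defect life cycle of Figure 1.2 encoded as an
    -- allowed-transition table (hypothetical names, not part of QC).
    CREATE TABLE defect_transitions (
        from_status VARCHAR2(10),
        to_status   VARCHAR2(10),
        changed_by  VARCHAR2(10)  -- group that performs the change
    );

    INSERT INTO defect_transitions VALUES ('New',      'Open',     'Developer'); -- developer agrees on the defect
    INSERT INTO defect_transitions VALUES ('New',      'Rejected', 'Developer'); -- developer does not agree
    INSERT INTO defect_transitions VALUES ('Open',     'Fixed',    'Developer'); -- fix delivered, back to the tester
    INSERT INTO defect_transitions VALUES ('Fixed',    'Closed',   'Tester');    -- retest passed
    INSERT INTO defect_transitions VALUES ('Fixed',    'Reopen',   'Tester');    -- retest failed
    INSERT INTO defect_transitions VALUES ('Rejected', 'Reopen',   'Tester');    -- the error persists
    INSERT INTO defect_transitions VALUES ('Reopen',   'Fixed',    'Developer'); -- fixed again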


Figure 1.2 Defect Life Cycle

1.3. Existing System

In Existing system two tools are used for test management. First tool is Technoport, for

assigning the requests (test requests in this project). Second tool is HP Quality Center,

for running the test cases and defect management.

1.3.1. Technoport

Technoport is a request assignment tool. All requests in the company can be handled with this tool, but this project focuses on test requests. A request has the attributes given in Figure 1.3; these fields are set by the person who creates the request.

Request ID: the unique ID of the request

Request Status: the status of the request; shows whether the request has been picked up, completed, etc.

Brief Description: the summary of the request

Assigned To: the person to whom the request is assigned

Assigned By: the person who created the request

Requested Finish Date: the deadline of the tests

Figure 1.3 Technoport Task Fields 1

There are some other fields that can be changed by the assigned person. The fields given in Figure 1.4 are used for planning the test request:

Figure 1.4 Technoport Task Fields 2

The fields given in Figure 1.5 below are used for the actual effort and the actual start and finish dates:


Figure 1.5 Technoport Task Fields 3

1.3.2. Quality Center

Quality Center is a testing tool used by the Testing and Quality team to run tests, assign defects and generate reports. It is one of the most widely used tools for this purpose. The user imports test cases and runs them; if the system does not behave according to the requirements, a defect must be created. Reports are generated according to the test results and defects.

Figure 1.6 Quality Center Defect Fields

The problem with the current system is that there is no connection between Technoport and Quality Center, so it is impossible to monitor the testing progress in one tool. Request monitoring and test progress monitoring are performed in different tools, and there are always inconsistencies between these tools (test start dates on QC versus actual start dates on Technoport, etc.).

To relate the defects to Technoport, a “Technoport ID / Project Name” field is added to the defect properties. “Fixed by Group” and “Node” fields are also added for the defect KPI reports, and a “Hot Defect” field is added for the daily hot defect reports.

Figure 1.7 Quality Center Required Test Case Fields

The Requirements field is used for another purpose: the Technoport ID / Project Name field is added for reporting.

Figure 1.8 Basic Test Progress

Some test requests are demands: small requests that are not managed by a project manager and whose test process takes between 1 and 10 days. There is no need to open a new schema on QC for these requests, because a schema has more than 30 tables and needs at least 40 MB of disk space. Demands are stored in folders in the same schemas. Phases of the same project are also stored in the same schema.


It is nearly impossible to select the test cases under the same parent folder, because it is not possible to estimate how deep the folder hierarchy goes. That is why the additional (and required) field Technoport ID / Project Name is added to all schemas on QC.

QC Dashboard:

QC Dashboard is a built-in test reporting tool in which the user can define portlets to generate reports. However, in order to generate reports about a project, the user has to enter that project; QC Dashboard is not useful for generating a single report over more than one project.

In the current system:

it is hard to monitor all projects and demands

it is impossible to find the test progress of a demand

it is hard to enter weekly plan reports

it is hard to publish all open defects

it is hard to report all the small jobs that take time and effort

it is hard to report defect KPIs


2. FEASIBILITY

The feasibility study covers the software and hardware requirements for developing and using this project.

Microsoft Visual Studio 2010, Microsoft SQL Server 2008 Express Edition and Microsoft Access were chosen for development; the reasons are given below.

2.1. Microsoft Visual Studio 2010

This is the development tool requested by the customer company. The product license was provided by the company.

2.2. Microsoft SQL Server 2008 Express Edition

Microsoft SQL Server 2008 Express Edition Service Pack 2 is an effective and reliable data platform that offers a varied set of features, data protection and performance for embedded application clients, web applications and light local data stores. SQL Server 2008 Express, designed for easy deployment and rapid prototyping, is free and is available for free redistribution with applications. It is designed to integrate well with other server infrastructure investments. [1]

This tool is free to download, which is the reason it was chosen.

2.3. Microsoft Access

Due to system security, access to the Technoport DB itself was not granted. Instead, an Access DB that is updated 2-3 times a week is provided for this tool. This is a hard requirement for the software.

2.4. Technical Feasibility

This part covers the developer's and the users' hardware and Internet requirements and their costs.

2.4.1. User

For users, it is enough to have a computer with an Internet connection. Table 2.1 lists the cheapest hardware for using the system.


Table 2.1 User Computer Hardware Requirements

CPU: 1.6 GHz, 800 MHz FSB
RAM: 512 MB DDR 400 MHz
HDD: 80 GB, 7200 rpm
Graphics card: 128 MB FX5200 AGP 8X
Keyboard - Mouse: multimedia keyboard, optical mouse
Ethernet: 10/100 Ethernet
Operating system: MS Windows XP Home
Monitor: 15"
Internet connection: 512 Kbps, 3 GB quota

2.4.2. Developer

This part of the feasibility study is prepared according to the system requirements of Visual Studio and MS SQL 2008.

Table 2.2 gives the system requirements of Visual Studio.

Table 2.2 Visual Studio 2010 hardware requirements

CPU: minimum 600 MHz Pentium; recommended 1 GHz Pentium
Operating system: Microsoft Windows XP Professional x64 Edition (WOW), Microsoft Windows XP Professional SP2 or Microsoft Windows XP Home Edition SP2
RAM: minimum 192 MB; recommended 256 MB

Table 2.3 gives the system requirements for MS-SQL Server 2008. The developer's hardware feasibility is prepared according to these data.


Table 2.3 System requirements for MS-SQL 2008

CPU: minimum 600 MHz Pentium; recommended 1 GHz Pentium or above
Operating system: Windows XP SP2 or above
RAM: 512 MB or above
Monitor: Super VGA (1,024x768) or above
Others: Microsoft Internet Explorer 6.0 SP1 or above, IIS 5.0 or above, ASP.NET 2.0

Table 2.4 Developer System Requirements

CPU: 2.66 GHz, 533 MHz FSB, 2x1 MB L2 cache
Memory: 1 GB DDR2 533 MHz
Hard disk: 80 GB
Graphics card: 256 MB AGP

2.5. Economic Feasibility

This part is the cost analysis based on the user's and the developer's hardware and software requirements.

2.5.1. User

For the user it is enough to use a computer as specified in Table 2.1; no extra hardware or software is needed.

The computer in Table 2.1 costs $495 + VAT [1].

The Internet connection costs 29 TL per month with Türk Telekom.

Computer: 730 TL


2.5.2. Developer

For the developer, the computer in Table 2.4, the Internet connection and the development software are costed in this part.

The computer in Table 2.4 costs $849 + VAT.

The Internet connection costs 29 TL per month with Türk Telekom.

Computer: 1320 TL

MS Visual Web Developer 2008 Express Edition: free

MS SQL Server 2008 Express Edition: free


3. SYSTEM ANALYSIS

The project developed provides the communication between these two tools. This communication gives the opportunity to obtain analyses of the task requests. The details of the project are given below.

3.1. System Analysis Overview

The tool (Test Reporting Center) is a web application.

Technoport requests are imported into the application. Due to privacy policies, a local MS Access database is created, and this database gets its data from the Technoport DB.

Project names are entered into the system manually by the user, because projects cannot be selected from the Technoport DB.

Reports are generated through a connection to the Quality Center database.

Project progress is monitored through a connection to the Quality Center database.

Figure 3.1 System DB Overview

The database relations of the Test Reporting Center are given above. TRC is able to create connections to Oracle, SQL and Access databases.


The Oracle connection is used to get project names, users and all reports about test cases and defects.

The SQL connection is used for creating and managing tasks and weekly reports.

The Access connection is used to get Technoport requests.

Stored procedures on the QC database:

Every project has its own schema on the QC DB. So, if a report about defects is selected over n projects, the user has to select the report from n BUG tables. In order to increase performance, three stored procedures were developed. Details of the SPs are given below, and a sketch of the underlying idea follows the list.

Test_Run_SP

This procedure selects the bugs and test runs of the projects and merges them into two tables: a TEST_RUN table for test cases and test runs, and a BUG table for bugs. It uses the TEST and TESTCYCL tables to select test runs and test cases, and the BUG table to select defects. These tables are used for the overview status of completed and ongoing projects.

Bug_KPI_SP

This procedure selects bugs and bug history from the BUG, AUDIT_LOG and AUDIT_PROPERTIES tables, merges AUDIT_LOG and AUDIT_PROPERTIES into a Bug_History table, and creates a Bug table on the home schema. It then calculates the time differences between status changes and creates the Bug_KPI and Bug_All_State_Durations tables. These tables are used for calculating the durations of the development, test and operation groups' actions in the defect KPI reports.

Prod_BUG_KPI_SP

This procedure works the same as Bug_KPI_SP but only on the Production_Defects project. It is possible to disable this procedure and use Bug_KPI_SP instead, but for performance reasons (production defect KPI reports are used more often than regular project defect KPI reports) the two SPs are kept separate.
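The SP source code is not reproduced in this report. The following is a minimal sketch of the idea behind Bug_KPI_SP, assuming the standard QC audit tables and simplified column handling; the real procedure additionally loops over every project schema and merges the results:

    -- Sketch: rebuild the status-change history of defects from the QC
    -- audit tables, then compute the time spent in each state as the gap
    -- between consecutive changes. Table names follow the QC schema; the
    -- column handling is simplified.
    CREATE TABLE Bug_History AS
    SELECT al.au_entity_id  AS bug_id,
           ap.ap_old_value  AS old_status,
           ap.ap_new_value  AS new_status,
           al.au_time       AS change_time
    FROM   audit_log al
           JOIN audit_properties ap ON ap.ap_action_id = al.au_action_id
    WHERE  al.au_entity_type = 'BUG'
    AND    ap.ap_field_name  = 'BG_STATUS';

    -- Time in each state = time until the next status change of the same bug.
    CREATE TABLE Bug_All_State_Durations AS
    SELECT bug_id,
           new_status AS state,
           change_time,
           LEAD(change_time) OVER (PARTITION BY bug_id ORDER BY change_time)
             - change_time AS days_in_state
    FROM   Bug_History;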


Figure 3.2 System DB Overview 2

Test management changes:

For this reporting tool some changes had to be made to the test and defect management on QC. In order to increase SP performance, additional fields were added to the projects; these fields are given in the Introduction.

Adding these fields increased SP performance by up to 80 percent.

3.2. Project Plan

The project plan continues from the previous term.

Phase 1

o This phase includes the previous work, which was given in detail.

Phase 2

o The modification of Phase 1 is done.

o The Weekly Project Report is added (details are given).

o Defect KPIs are added.

Phase 3

o The modification of Phase 2 is done (defect fixing, new requirements, etc.).

o User authentication is added.

o Task pages are added.

o Administration pages are added.

Figure 3.3 Project Plan


4. SYSTEM DESIGN

The system is designed as a web application and, as mentioned in the previous sections, reports are obtained from both the Access and Oracle databases. For security reasons the Access database is updated manually from the database itself, and because of performance concerns the procedures on the QC database are not executed by the application. Instead, the procedures are executed by database jobs at fixed intervals; one possible job definition is sketched below.

Test_Run_SP is executed every 30 minutes.

Bug_KPI_SP is executed every 24 hours.

Prod_BUG_KPI_SP is executed every 24 hours.
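The job definitions themselves are not shown in the report. On the Oracle side, one way to schedule the procedures, assuming DBMS_SCHEDULER is available, would be:

    -- Sketch: run Test_Run_SP every 30 minutes. The job name is
    -- illustrative; the two KPI SPs would use 'FREQ=DAILY' instead.
    BEGIN
      DBMS_SCHEDULER.CREATE_JOB(
        job_name        => 'TRC_TEST_RUN_JOB',
        job_type        => 'STORED_PROCEDURE',
        job_action      => 'TEST_RUN_SP',
        start_date      => SYSTIMESTAMP,
        repeat_interval => 'FREQ=MINUTELY;INTERVAL=30',
        enabled         => TRUE);
    END;
    /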

4.1. User Authentication and User Roles

There are three types of users in the system: admin, tester (user) and visitor. The features the system provides to these users are summarized below.

4.1.1. Admin

Admin users are the managers of the tool; they are able to:

Generate weekly reports

Add/remove/update users

Add/remove/update projects

Do everything that a tester is able to do

4.1.2. Tester (User)

Tester users are the general users of the tool; they are able to:

Create/delete/update tasks

View tasks

Create weekly reports

Do everything that a visitor is able to do

4.1.3. Visitor

Visitors are unauthenticated users of the tool; they are able to see:

Open Defects List

Hot Defects List

Production Defects List

Production and Project Defect KPIs

Ongoing / Completed Projects and Tasks

4.2. Main Menu

The project is divided into three phases for ease of management, and because the project plan began in 2009 some parts were developed earlier (such as the SPs, the daily report page, etc.).

The main menu items are listed below:

Defects

o Open Defects

o Hot Defects

o Production Defects

Projects / Demands

o Project / Demand Reports

Defect KPI

o Production Defects

All State Durations

Defect Turnaround Time

Average Time for Defects

Weekly Defects

Number Of Defects

o Project Defects

All State Durations

Defect Turnaround Time

Average Time for Defects

Weekly Defects

Number Of Defects

Administration

17

Page 28: TEST REPORTING CENTER

o Tasks

Create Task

View Tasks

o Weekly Report

New Weekly Report

View Weekly Reports

o User Management

o Project Management

User access to these pages is given in Table 4.1.

If an authenticated user with tester privileges tries to access the administration pages, the user gets the error “You are not authorized to view this page”.

If a visitor tries to access the administration or tester pages, the user is redirected to the login page. After a successful login, the user is redirected back to the page he or she came from.

Table 4.1 User access to pages


4.3. Defects

This menu has three pages about open defects. Details of the pages are given below.

4.3.1. Production Defects

Production defects are viewed and test group comments are added on this page. The page can be viewed by all visitors; it is updated by the testers and published by an admin.

Production_Defects is a project in QC. These defects are detected in the production environment, and some of them are related to n requests in Technoport (a development request, a test request, a previous test request, etc.).

These defects are reported twice a week, and whether or not it was tested before, there must be a tester's comment on every production defect.

In the current system this is handled with Excel files, and it is difficult to manage all the testers and all the defects.

Figure 4.1 Production Defects Page

Defects can be filtered by the conditions below:

- Assigned to

- Fixed by Group

- Status

- Severity

- Detected on Node


Figure 4.2 Production Defects Page - Filter

If the “detail” link is clicked, the defect detail page opens. This page was added to prevent a report with too many columns from becoming unreadable.

Figure 4.3 Production Defects Details Page

The domain name and project name are given for users who want to see all the details of a defect, so that they can log in to Mercury and search for it.

4.3.2. Open Defects

The Open Defects page is necessary for developers to manage the defects assigned to them. There are more than 300 projects on QC, and sometimes the mail sending function may not work, so a person is not informed about his or her defects. This page allows users to filter all defects without entering every project.


Figure 4.4 Open Defects Page

This page has the same filters as Production Defects and also the same details page. In addition to the production defect filters, open defects (and also hot defects) can be filtered by “Project Name”. Like the Production Defects page, this page does not require authentication.

Figure 4.5 Open Defects Page - Filter

4.3.3. Hot Defects

Hot defects are mostly high or urgent priority defects that block requests or projects. These defects used to be reported in an Excel file every morning.

In the Test Reporting Center (TRC), High, Very High and Urgent defects are listed on a page of their own; unlike the Open Defects page, the full description and the attachments can be reached by clicking on the defect.

In the BUG table there is a user-defined field for hot defects; if its value is “Yes”, the defect is reported as a hot defect, as in the query sketched below the figure. The hot defect report does not require authentication.

Figure 4.6 Hot Defects Page
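Assuming the flag is stored in one of QC's user-defined BUG columns (the exact BG_USER_nn column depends on the project customization), the hot defect list boils down to a query of this shape:

    -- Sketch: hot defects of one project schema. BG_USER_01 stands in for
    -- the user-defined "Hot Defect" column; the real column number varies.
    SELECT bg_bug_id,
           bg_summary,
           bg_priority,
           bg_responsible,
           bg_status
    FROM   bug
    WHERE  bg_user_01 = 'Yes'            -- the Hot Defect flag
    AND    bg_status <> 'Closed'
    ORDER  BY bg_priority, bg_detection_date;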


The Hot Defects page can be filtered by the same fields as Open Defects. The details page works the same as for Open Defects and Production Defects.

4.4. Projects / Demands

This menu has one page for both projects and demands; ongoing and completed demand and project reports can be taken from this page.

Figure 4.7 Project / Demand Page Filters

There are three filters. If “project” is selected in the first filter, the project names are filled into the third filter from the QC DB; otherwise, the Technoport request IDs and names are filled from the Access DB. Ongoing or completed demands and projects can also be selected.


Figure 4.8 Project / Demand Page

Reports are based on test runs and defects. The test run table is built from the statuses of the test cases, and the defect table from the statuses and severities of the defects.

4.5. Defect KPI

Defect KPI reports are delivered weekly to all directorates in the company. Defect KPIs are detailed defect reports by severity, status and directorate. The defect KPIs are listed below:

4.5.1. Weekly Defect Trend:

This figure shows the defects that were opened during the week (blue bar), the defects that were closed during the week (pink bar) and the total open defects (defects opened earlier and not yet fixed). This report is selected from the Bug and Bug_History tables.

Figure 4.9 Weekly Defect Trend

The page below was developed for this report. It reports all opened, closed and open defects starting from April 2010, when the defect trend reports began. A sketch of the underlying query is given after the figure.


Figure 4.10 Weekly Defects
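The aggregation behind this page can be sketched as follows, assuming the merged Bug table produced by Bug_KPI_SP and Oracle's TRUNC(date, 'IW'), which truncates a date to the Monday of its ISO week:

    -- Sketch: defects opened per week since April 2010. Closed defects are
    -- counted the same way over bg_closing_date; "total open" at the end of
    -- a week is opened-so-far minus closed-so-far.
    SELECT TRUNC(bg_detection_date, 'IW') AS week_start,
           COUNT(*)                       AS opened
    FROM   bug
    WHERE  bg_detection_date >= DATE '2010-04-01'
    GROUP  BY TRUNC(bg_detection_date, 'IW')
    ORDER  BY week_start;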

4.5.2. Defect Turnaround Duration:

This report shows the time between the detection and the closing of a defect, by priority. It is selected from the Bug table; a sketch of the calculation follows the figure.

Figure 4.11 Defect Turnaround Time
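The calculation reduces to the average of the closing date minus the detection date, grouped by priority; a sketch over the merged Bug table (date subtraction in Oracle yields days):

    -- Sketch: average turnaround time in days, by priority, for closed defects.
    SELECT bg_priority,
           AVG(bg_closing_date - bg_detection_date) AS avg_turnaround_days
    FROM   bug
    WHERE  bg_status = 'Closed'
    GROUP  BY bg_priority
    ORDER  BY bg_priority;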


The Defect Turnaround Time page was developed for this report and looks like this:

Figure 4.12 Defect Turnaround Time

4.5.3. Number of Defects:

This report shows the defect counts and priorities by department. It is selected from the Bug table.


Figure 4.13 Number of Defects

Department data is taken from the Fixed by Group values of the defects; the aggregation is sketched below.
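The counting itself is a plain aggregation; a sketch over the merged Bug table, assuming the Fixed by Group value lives in a user-defined column (BG_USER_02 is a placeholder):

    -- Sketch: defect counts by department (Fixed by Group) and priority.
    SELECT bg_user_02 AS department,     -- user-defined "Fixed by Group"
           bg_priority,
           COUNT(*)   AS defect_count
    FROM   bug
    GROUP  BY bg_user_02, bg_priority
    ORDER  BY department, bg_priority;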

Figure 4.14 Number of Defects by Departments


4.5.4. Defect Turnaround Duration by Department:

This report shows the priorities and the total time between the detection and the closing of defects, by department. It is selected from the Bug table.

Figure 4.15 Defect Turnaround Duration by Department

The page developed for this report is given below. The values are in days.


Figure 4.16 Defect Turnaround Duration by Department

4.5.5. Average Time for Defect States:

This report shows the time between detection and fixing, and the time between the retest after fixing and the closing of the defects, by priority.

Figure 4.17 Average Time for Defect States


This report is selected from the Bug_All_State_Durations table according to the responsible group for each state change, as given below. The responsibilities differ between regular project defects and production defects.

Table 4.2 Responsible Groups for Defect Status Changes

Tester                  Developer
New → Open              New → Fixed
Rejected → Reopen       New → Rejected
Fixed → Closed          Open → Fixed
Fixed → Reopen          Open → Rejected
                        Reopen → Fixed
                        Reopen → Rejected

As mentioned in the defect life cycle (Figure 1.2), defects go through a number of status changes, and the time between these changes represents the effort of the groups. Table 4.2 shows the effort owner of each status change for project defects. Project defects belong to work that has not yet been deployed to the live system, so the operation team does not appear in this responsibility table.

Table 4.3 Responsible Groups for Defect Status Changes (Production Defects)

Tester                  Developer               Operation
New → Open              New → Fixed             New (defect creation)
Fixed → Reopen          New → Rejected          Tested → Reopen
Fixed → Tested          Open → Fixed            Tested → Closed
                        Open → Rejected         Rejected → Reopen
                        Reopen → Fixed
                        Reopen → Rejected

Table 4.3 shows the responsible groups for production defect status changes. The operations team detects the defects, the development team prepares the fix, and the test team tests the fix and assigns it to the production team in “Tested” status.

According to these status changes, the page shown in Figure 4.18 was developed; a sketch of how the state durations can be attributed to the groups is given below.
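One way to attribute each state duration to a group is a small lookup that encodes Tables 4.2 and 4.3: each state is owned by the group expected to act on it (for example, Fixed waits for the test group and Tested waits for operation). A sketch, assuming the Bug_All_State_Durations table and a hypothetical responsible_groups lookup:

    -- Sketch: average time each group keeps defects waiting, derived from
    -- Bug_All_State_Durations. responsible_groups (state -> group_name) is
    -- a hypothetical lookup encoding Tables 4.2 and 4.3.
    SELECT rg.group_name,
           d.state,
           ROUND(AVG(d.days_in_state), 1) AS avg_days_in_state
    FROM   bug_all_state_durations d
           JOIN responsible_groups rg ON rg.state = d.state
    GROUP  BY rg.group_name, d.state
    ORDER  BY rg.group_name, d.state;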


Figure 4.18 Average Time for Defect States

4.6. Weekly Report

Weekly reports are the testers' weekly plans for tasks and projects. There is a separate item for each project and each demand in the weekly plan. The project name or demand ID can be null, in case the planned work is not related to any Technoport request or project.


4.6.1. New Weekly Report

Weekly reports include the items given below and show the weekly plan of the tester:

Job description

Related project or demand

Planned duration

Actual start date

Actual end date

Actual duration

Preventing factors

The Weekly Reports page was developed as below:

Figure 4.19 Weekly Reports

The report owner name is selected from the Quality Center DB.

The Technoport ID / project name is selected from the Access DB (the DB that is updated from Technoport).

The report dates are set to the previous Monday and the next Sunday, based on the system date.

If a Technoport Request ID is selected, the Request Name, Planned Duration and Start Date fields are filled from the Access DB (the local database of Technoport).

If an item is inserted into the weekly report, the page is refreshed and the inserted item can be seen at the top of the screen:


Figure 4.20 Weekly Reports – Item Inserted

As can be seen, if an item is inserted successfully, the message “Report item recorded successfully” is shown.

Figure 4.21 Successful Message

On this page the user can also delete or update the item.

Figure 4.22 Successful Message


4.6.2. View Weekly Reports

On this page the inserted reports can be seen as a whole. The people who have not entered their reports are also announced.

Figure 4.23 View Weekly Reports

When a report date is selected, the grid is filled with all the report items, and the list of people who did not plan their work week is selected from the SQL database.

4.7. Tasks

The task pages were developed for small jobs or support work that does not have a Technoport request but takes time and needs to be reported. In the current system testers must record these jobs and report them manually, which causes loss of effort and also misleads the effort reports.

With the pages below it is possible to analyse this otherwise unseen test support effort.

4.7.1. Create Tasks

Tasks include the items below:

Created By

Assigned To

Task Detail

Creation Date

Closing Date

The creation date is the date on which the user clicks the “Create Task” button. The closing date is set when the user updates the task and fills in the closing date field.


Figure 4.24 Create Task Page

If the user creates the task successfully, the message given below is shown.

Figure 4.25 Task Created Message

The user can be redirected to the “Show Tasks” page by clicking the “See All Tasks” hyperlink.

4.7.2. Show Tasks

On this page the user can see all the tasks and can filter them according to the fields given below:

Created By

Assigned To

Status


The Status field can be “Open”, “Closed” or “All”. “Closed” means that the task's closing date field is not empty.

Figure 4.26 Show Tasks Page

The user can edit or delete the tasks on this page.


5. CONCLUSION

In this section the advantages of the tool are discussed separately for each of its functions.

For the weekly reports feature, with this tool:

users are able to create weekly reports easily;

generating all group members' weekly reports takes only a few seconds;

finding the group members who did not create a weekly report also takes a few seconds;

human error is minimized;

in the current system, it takes the administrator around one hour every week to merge and publish all the reports.

For the tasks feature, with this tool:

the user is not obliged to record the start and stop times of small tasks;

tasks are easier to follow than e-mails or post-its;

user effort can be measured;

in the current system it is impossible to measure this effort.

For the open defects feature, with this tool:

visitors can see the defects they are responsible for;

managers can see the defects their groups are responsible for;

for hot defects, nobody needs to collect and publish the urgent defects any more;

in the current system it is impossible to measure how many person-days it takes for everyone to see all the defects they are responsible for;

in the current system it takes the administrator around half an hour per day to collect and report the hot defects, and the report is often forgotten entirely or misses some defects.

For the ongoing/completed demands and projects feature, with this tool:

it is possible to see the status of a project or demand without looking for it in Quality Center;

there is no need to log in and out of many projects just to see a few task and project reports;

in the current system it is impossible to measure how many person-days it takes to see the project and demand status reports.

For the project and production defect KPIs feature, with this tool:

it is possible to create defect KPIs every day (in the current system, because of the manual effort required, they are created only once a week);

human error is minimized;

in the current system it takes the administrator around 3 hours to prepare and publish the report.

In conclusion, this application saves 3 person-days of effort per month on the measurable efforts alone. It prevents human errors and saves an unmeasured effort of searching for demands and for the test cases and defects related to them.

Because the Technoport DB connection is made through a local Access DB and the QC DB connection is made through stored procedures, it will be possible to adapt this tool to any task management or test management tool in the future. It will be enough to create stored procedures that build the same bug and test case tables, and an Access DB that is updated from the prospective task management tool.

5.1. Possible Developments

An automatic mail system can be added to the application, sending a mail when:

a weekly report has not been created;

a task is assigned to a person;

a task is closed.

A to-do list can be added to the login page for the same events:

a weekly report that has not been created;

a task assigned to the person;

a task that has been closed.

User authentication data can be taken from the Technoport DB or the QC DB, so that it is easier for the user to remember the username and password.


REFERENCES

[1] http://www.microsoft.com/visualstudio/en-us/products/2008-editions/professional [last visited 10.11.2010]

[2] HP Business Process Testing 9.2 User's Guide [last visited 20.12.2010]

[3] Glenford J. Myers, The Art of Software Testing [last visited 13.06.2011]

[4] http://en.wikipedia.org/wiki/Software_test [last visited 13.06.2011]


CURRICULUM VITAE

Name: Emrah İLERİGELEN

Birth Date: 02.01.1983

Birth Place: Akhisar/Manisa

High School: Akhisar Anatolian High School

Training Company: Vestel A.Ş.

Working Company: Avea İletişim A.Ş.