SOFTWARE QUALITY ASSURANCE – TEST MANAGEMENT. Seminar: Oana FEIDI, Quality Manager – Continental Automotive


Page 1: Test management

SOFTWARE QUALITY ASSURANCE – TEST MANAGEMENT

Seminar: Oana FEIDI, Quality Manager – Continental Automotive

Page 2: Test management

Project team

Project Manager

Test Manager

SW Project Manager

Quality Manager

Page 3: Test management

Test management – definitions

An important part of software quality is the process of testing and validating the software.

Test management is the practice of organizing and controlling the process and artifacts required for the testing effort.

The general goal of test management is to allow teams to plan, develop, execute, and assess all testing activities within the overall software development effort.

◦ This includes coordinating the efforts of all those involved in the testing effort, tracking dependencies and relationships among test assets and, most importantly, defining, measuring, and tracking quality goals.
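The assets and goals named above can be pictured with a minimal data-structure sketch. The class and field names here are illustrative assumptions, not anything defined on the slides:

```python
from dataclasses import dataclass, field

# Hypothetical sketch of what test management tracks: test assets,
# their dependencies, and measurable quality goals.
@dataclass
class TestAsset:
    name: str
    depends_on: list = field(default_factory=list)  # names of related assets

@dataclass
class QualityGoal:
    metric: str     # e.g. "requirement coverage"
    target: float   # e.g. 0.95
    actual: float = 0.0

    def met(self) -> bool:
        return self.actual >= self.target

plan = TestAsset("release 1.0 test plan")
suite = TestAsset("regression suite", depends_on=[plan.name])
goal = QualityGoal("requirement coverage", target=0.95, actual=0.90)
print(goal.met())  # False: coverage goal not yet reached
```

Even this small a model makes the "defining, measuring, and tracking quality goals" part concrete: a goal is something with a target and a current value that can be compared at any time.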

Page 4: Test management

Test management – phases

Test artifacts and resource organization

Test planning is the overall set of tasks that address the questions of why, what, where, and when to test.

Test authoring is the process of capturing the specific steps required to complete a given test.

Test execution consists of running the tests by assembling sequences of test scripts into a suite of tests.

Test reporting is how the various results of the testing effort are analyzed and communicated. This is used to determine the current status of project testing, as well as the overall level of quality of the application or system.

[Chart: Automation level – actual number of specified TCs (0–3500) and actual vs. planned automation level (% of automated TCs, rising from roughly 28% to 80%) per calendar week (CW13–CW51) and delivery step (LS 7, LS 7.1, LS 8, LS 9).]
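The automation-level curve in the chart above is, presumably, just the share of specified test cases that have been automated. A minimal sketch of that metric, under that assumption:

```python
# Sketch of the automation-level metric from the chart, assuming it is
# automated test cases as a percentage of specified test cases.
def automation_level(automated_tcs: int, specified_tcs: int) -> float:
    """Percentage of specified test cases that are automated."""
    if specified_tcs == 0:
        return 0.0  # nothing specified yet, nothing to automate
    return 100.0 * automated_tcs / specified_tcs

# e.g. a late calendar week: 2800 of 3500 specified TCs automated
print(round(automation_level(2800, 3500)))  # 80
```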

Page 5: Test management

Test management – phases (examples)

Page 6: Test management

Test management – challenges

Why should I test? What should I test? Where do I test? When do I test? How do I conduct the tests?

Not enough time to test
Not enough resources to test
Testing teams are not always in one place
Difficulties with requirements
Keeping in sync with development
Reporting the right information

http://www.ibm.com/developerworks/rational/library/06/1107_davis/

Page 7: Test management

Test management – priority definitions

Example Priority Definitions

◦ P1 – Failure on this test is likely to result in a loss or corruption of data. This test must be run as soon as practicable and must also be run on the final build.

◦ P2 – Failure on this test is likely to result in unacceptable loss of functionality. This test must be run as soon as practicable. The test should also be run for the final time once development in this area of functionality has stabilized.

◦ P3 – Failure on this test is likely to result in loss of functionality but there may well be workarounds available. This test should be run only once development in this area of functionality has stabilized.

◦ P4 – Failure on this test is likely to result in loss of functionality that is not critical to a user. This test should be run once and probably need not be run again.

◦ P5 – Failure on this test is likely to indicate a trivial problem with the functionality. If time permits it would be nice to run these tests, but they need not be completed if the time scales don’t allow (i.e., if this test was carried out and failed, it would not stop the product shipping).
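One way to make the P1–P5 definitions above operational is a small selection rule that decides whether a test of a given priority should run on a given build. The rule below is one reading of the definitions, not an exact scheme from the slides:

```python
# Illustrative selection rule derived from the P1-P5 definitions above.
# The parameters and the P4 interpretation are assumptions.
def should_run(priority: int, is_final_build: bool, area_stable: bool) -> bool:
    if priority == 1:          # run ASAP and always on the final build
        return True
    if priority == 2:          # run ASAP; final run once the area stabilizes
        return True
    if priority == 3:          # run only once the area has stabilized
        return area_stable
    if priority == 4:          # run once when stable; skip on the final build
        return area_stable and not is_final_build
    return False               # P5: only if time permits, never mandatory

print(should_run(1, is_final_build=True, area_stable=True))    # True
print(should_run(3, is_final_build=False, area_stable=False))  # False
```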

Page 8: Test management

Test management – classification examples (automotive)

Renault rating

Continental rating

Page 9: Test management

Test management – specific rules

Test technique type    Example
Systematic             Boundary value (~85%)
Lessons learned        Checklist (~5%)
Intuitive              Error guessing (~5%)
Supporting             Stress test, robustness test (~5%)
Special                Critical timing analysis (only if applicable)
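The "Systematic" row, which dominates the distribution above, refers to boundary value analysis. A minimal sketch of the idea, for an integer input range (the speed-signal example is invented for illustration):

```python
# Boundary value analysis sketch: for an input valid in [lo, hi],
# test the edges, just inside them, and just outside them.
def boundary_values(lo: int, hi: int) -> list:
    return [lo - 1, lo, lo + 1, hi - 1, hi, hi + 1]

# e.g. a hypothetical speed signal valid in [0, 250] km/h
print(boundary_values(0, 250))  # [-1, 0, 1, 249, 250, 251]
```

The values just outside the range exercise the error handling; the edges and the values just inside catch the classic off-by-one mistakes in range checks.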

Page 10: Test management

Role of test manager

What the test manager is responsible for:

◦ Defining and implementing the role testing plays within the organization.

◦ Defining the scope of testing within the context of each release/delivery.

◦ Deploying and managing the appropriate testing framework to meet the testing mandate.

◦ Implementing and evolving appropriate measurements and metrics:
   To be applied against the product under test.
   To be applied against the testing team.

◦ Planning, deploying and managing the testing effort for any given engagement/release.

◦ Managing and growing the testing assets required for meeting the testing mandate: team members, testing tools, testing processes.

◦ Retaining skilled testing personnel.

Page 11: Test management

Test management – recommendations

Start test management activities early
Test iteratively
Reuse test artifacts
Utilize requirements-based testing:

◦ Validating that something does what it is supposed to do
◦ Trying to find out what can cause something to break

Define and enforce a flexible testing process
Coordinate and integrate with the rest of development
Communicate status
Focus on goals and results
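The two sides of requirements-based testing named above (validating expected behaviour, and probing what can break) can be sketched as a positive and a negative test. The clamp() function here is a made-up example, not something from the slides:

```python
# Requirements-based testing sketch: one positive test (does what it
# should) and one negative test (what can make it break).
def clamp(value, lo, hi):
    """Restrict value to the range [lo, hi]; reject an empty range."""
    if lo > hi:
        raise ValueError("empty range")
    return max(lo, min(hi, value))

assert clamp(5, 0, 10) == 5       # positive: normal behaviour
assert clamp(-3, 0, 10) == 0      # positive: boundary handled
try:
    clamp(1, 10, 0)               # negative: invalid range must fail
except ValueError:
    pass                          # breaking input was rejected as required
```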

Page 12: Test management

Test management – testing metrics

Number of faults detected per functionality, ordered by severity, before delivery
Number of test cases per functionality
Number of test steps per test case
Number of test cases per requirement
Number of faults detected by test cases before delivery
Effort for execution of test cases
Requirement coverage by test cases
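The last metric in the list, requirement coverage by test cases, is usually the share of requirements referenced by at least one test case. A minimal sketch under that assumption (the REQ/TC identifiers are invented for illustration):

```python
# Sketch of "requirement coverage by test cases": the percentage of
# requirements that at least one test case is linked to.
def requirement_coverage(requirements, test_case_links):
    """test_case_links maps test case name -> set of requirement ids."""
    covered = set().union(*test_case_links.values()) if test_case_links else set()
    return 100.0 * len(covered & set(requirements)) / len(requirements)

reqs = ["REQ-1", "REQ-2", "REQ-3", "REQ-4"]
links = {"TC-01": {"REQ-1", "REQ-2"}, "TC-02": {"REQ-2", "REQ-3"}}
print(requirement_coverage(reqs, links))  # 75.0 - REQ-4 is uncovered
```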

Page 13: Test management

Test management – testing metrics