
  • Advanced Software Engineering: Software Testing COMP 3705 (Lecture 1)

    Sada Narayanappa

    Anneliese Andrews (Chair, DU), Thomas Thelin, Carina Andersson

  • 2

    Lectures

    • Theory + discussions
    • Cover the basic parts of software testing:

    1. Introduction
    2. Black-box, reliability, usability
    3. Inspections, white-box testing
    4. Lifecycle, documentation
    5. Organization, tools
    6. Metrics, TMM
    7. Research presentation

    (Figure: the topics range from technical, through technical/managerial, to managerial and economic perspectives.)

  • 3

    Examination

    • Written exam based on the book and lab sessions
    • Lab sessions (must be approved)
    • Project/presentations are graded
    • See the class web site for the assignment policy

  • 4

    This week

    • Read the course program
    • Project
      • Read "Projects in Software Testing" and pick a research topic
      • Decide on a research subject
      • Discuss papers with me – describe why it is interesting
    • Lab
      • Prepare lab 1
    • Read Burnstein 1-3
      • Prepare Burnstein 4, 12

  • 5

    Schedule

    • Read
      • Course program
      • Projects in Software Testing
    • Check the homepage
    • If needed we will have extra lab dates
    • Ask the TA for help with lab work

  • 6

    Facts about testing

    System development effort [Brooks75]:
    • 1/3 planning
    • 1/6 coding
    • 1/4 component test
    • 1/4 system test

  • 7

    Why use testing?

    • Risk mitigation
    • Faults are found early
    • Faults can be prevented
    • Reduced lead-time
    • Deliverables can be reused
    • …

  • 8

    Why do faults occur in software?

    • Software is written by humans
      • Who know something, but not everything
      • Who have skills, but aren't perfect
      • Who don't usually use rigorous methods
      • Who do make mistakes (errors)
    • Developers are under increasing pressure to deliver to strict deadlines
      • No time to check, assumptions may be wrong
      • Systems may be incomplete
    • Software is complex, abstract and invisible
      • Hard to understand
      • Hard to see if it is complete or working correctly
      • No one person can fully understand large systems
      • Numerous external interfaces and dependencies

  • 9

    TMM – Test Maturity Model

    • Stages or levels to move from a chaotic, unstructured testing model to a systematic, process-driven approach
    • Software is an engineering discipline; it requires both
      • Quality in the process
      • Quality in the product

  • 10

    TMM Test process Evaluation

    • Guided by the Capability Maturity Model (CMM)
    • Stages or levels to evolve from and to
    • Each level, except Level 1, has structure
    • Level 2 and above have:
      • Structure
      • Goals
      • Organization structure

  • 11

    Internal structure of TMM

    (Figure: internal structure of a TMM level.) A level indicates a testing capability and contains maturity goals. Maturity goals are supported by maturity subgoals, which are achieved by activities, tasks, and responsibilities. These are organized by implementation and organizational adaptation and address three critical views: manager, developer, and user/client.

  • 12

    Test Maturity Model

  • 13

  • 14

    TMM Levels

    • Level 1 – No process defined; debugging and testing are treated as the same
    • Level 2 – Test & debug tools / test plan / basic test process
    • Level 3 – Test organization / technical training / lifecycle / control & monitoring; supports the V-model
    • Level 4 – Review process / test measurement / software quality evaluation – test logging with severity
    • Level 5 – Defect prevention / quality / process optimization

  • 15

    Good enough quality

    To claim that any given thing is good enough is to agree with all of the following propositions:

    • It has sufficient benefits

    • It has no critical problems

    • The benefits sufficiently outweigh the problems

    • In the present situation, and all things considered, further improvement would be more harmful than helpful

    James Bach, IEEE Computer, 30(8):96-98, 1997.

  • 16

    Quality attributes – ISO 9126

  • 17

    Quality attributes

  • 18

    Origins of defects

    (Figure: fault model.) Defect sources – lack of education, poor communication, oversight, transcription, and an immature process – impact the software artifacts as errors. Errors lead to faults/defects, and faults lead to failures. From the user's view, the impact is poor quality software and user dissatisfaction.

  • 19

    Whoops, that’s my calculator

  • 20

    Testing, Verification & Validation

    Definition 1
    • Verification – is the product right?
    • Validation – is it the right product?

    Definition 2
    • Verification – satisfies the conditions set at the start of the phase
    • Validation – satisfies the requirements

    Testing – the process of evaluating a program or a system

  • 21

    Definitions

    • A failure is an event; a fault is a state of the software caused by an error
    • Error – a human mistake
    • Fault / Defect – an anomaly in the software
    • Failure – inability to perform its required functions
    • Debugging / Fault localization – localizing, repairing, and retesting

    (A sketch of the error → fault → failure chain follows below.)
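    A minimal, hypothetical Python sketch (not from the slides) of how an error introduces a fault that a test then exposes as a failure:

    def average(values):
        # Fault: the divisor is off by one, a defect introduced by a
        # programmer's error (a human mistake) while writing the code.
        return sum(values) / (len(values) + 1)

    def test_average():
        # Failure: the observed output deviates from the required result.
        assert average([2, 4, 6]) == 4   # fails, since average() returns 3.0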

  • 22

    Definitions

    • A TEST CASE consists of:
      • A set of inputs
      • Execution conditions
      • Expected outputs
    • A TEST is:
      • A group of related test cases
      • The procedures needed to carry out those test cases

    These are the IEEE definitions; an organization may define additional attributes. A minimal example of a test case is sketched below.
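    A hypothetical sketch of a single test case using Python's unittest, showing the three parts named above (the bank-account scenario is invented for illustration):

    import unittest

    class WithdrawTestCase(unittest.TestCase):
        def setUp(self):
            # Execution conditions: an account with a known starting balance.
            self.balance = 100

        def test_withdraw_within_balance(self):
            amount = 30                         # input
            new_balance = self.balance - amount
            self.assertEqual(new_balance, 70)   # expected output

    if __name__ == "__main__":
        unittest.main()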

  • 23

    Scripted and non-scripted testing

    • In scripted testing, test cases are pre-documented in detailed, step-by-step descriptions
      • Different levels of scripting are possible
      • Scripts can be manual or automated (an automated sketch follows below)
    • Non-scripted testing is usually manual testing without detailed test case descriptions
      • It can be disciplined, planned, and well-documented exploratory testing
      • or ad-hoc testing
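    A hypothetical sketch of an automated scripted test, where each documented step becomes an executable statement with an explicit expected result (the login scenario is invented for illustration):

    def test_login_script():
        session = {}                        # Step 1: start with no active session
        session["user"] = "alice"           # Step 2: submit valid credentials
        assert session["user"] == "alice"   # Step 3: verify the user is logged in
        session.clear()                     # Step 4: log out
        assert "user" not in session        # Step 5: verify the session is cleared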

  • 24

    Test oracle

    • An oracle is the principle or mechanism by which you recognize a problem
    • A test oracle provides the expected result for a test, for example:
      • A specification document
      • A formula
      • A computer program
      • A person
    • In many cases it is very hard to find an oracle
      • Even the customer and end user might not be able to tell which is the correct behaviour

    (A sketch of a formula-based oracle follows below.)
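    A hypothetical sketch of a formula/program oracle: a trusted library routine supplies the expected result for each test input (fast_sqrt is an invented name for the code under test):

    import math

    def fast_sqrt(x):
        # Implementation under test (assumed for illustration).
        return x ** 0.5

    def test_against_oracle():
        # Oracle: math.sqrt provides the expected result for each input.
        for x in [0.0, 1.0, 2.0, 144.0]:
            assert math.isclose(fast_sqrt(x), math.sqrt(x), rel_tol=1e-9)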

  • 25

    Test Bed

    • The environment that contains all the hardware and software needed to test a software component or system
    • Examples:
      • Simulators
      • Emulators
      • Memory checkers

  • 26

    Other Definitions

    • It is important to understand the following definitions:
      • Quality – degree to which specified requirements are met
      • Metric – a quantitative measure
      • Quality metric – a quantitative measure of quality (sketch below)
      • Correctness – performs the specified function
      • Reliability – performs under stated conditions
      • Usability – effort needed to use the system
      • Integrity – ability to withstand attacks
      • Portability / maintainability / interoperability …
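    A hypothetical sketch of one simple quality metric, defect density (defects found per thousand lines of code); the numbers are invented for illustration:

    def defect_density(defects_found, lines_of_code):
        # Quality metric: defects per KLOC (thousand lines of code).
        return defects_found / (lines_of_code / 1000)

    print(defect_density(12, 8000))   # 1.5 defects per KLOC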

  • 27

    Principle 1 – purpose of testing

    Testing is the process of exercising a software component using a selected set of test cases, with the intent of:

    1. Revealing defects
    2. Evaluating quality

  • 28

    Principles

    2: A good test case – When the test objective is to detect defects, a good test case is one that has a high probability of revealing an as-yet-undetected defect

    3: Test result – The results should be inspected meticulously

    4: Expected output – A test case must contain the expected output

  • 29

    Principles

    5: Input – Test cases should be developed for both valid and invalid input conditions (see the sketch after this list)

    6: Fault content estimation – The probability of the existence of additional defects in a software component is proportional to the number of defects already detected in that component

    7: Test organization – Testing should be carried out by a group that is independent of the development group
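    A hypothetical sketch of test cases covering both valid and invalid input conditions, using a small parsing function invented for illustration:

    def parse_age(text):
        # Function under test: accepts ages 0-150, rejects everything else.
        age = int(text)                    # raises ValueError for non-numeric input
        if not 0 <= age <= 150:
            raise ValueError("age out of range")
        return age

    def test_valid_input():
        assert parse_age("42") == 42       # valid input condition

    def test_invalid_input():
        for bad in ["-5", "200", "abc"]:   # invalid input conditions
            try:
                parse_age(bad)
            except ValueError:
                continue
            raise AssertionError("expected ValueError for " + bad)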

  • 30

    Principles

    8: Repeatable – Tests must be repeatable and reusable (sketch below)

    9: Planned – Testing should be planned

    10: Life cycle – Testing activities should be integrated into the software life cycle

    11: Creative – Testing is a creative and challenging task
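    A hypothetical sketch of keeping a test repeatable: any randomness is pinned to a fixed seed so every run exercises exactly the same inputs and the test can be reused in regression suites:

    import random

    def test_sort_is_idempotent():
        rng = random.Random(1234)       # fixed seed makes the test repeatable
        data = [rng.randint(0, 1000) for _ in range(50)]
        once = sorted(data)
        assert sorted(once) == once     # sorting twice must change nothing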

  • 31

    Goals of the course

    A test specialist – a trained engineer – has knowledge of test-related principles, processes, measurements, standards, plans, tools, and methods, and learns how to apply them to the testing tasks to be performed.

    • Knowledge

    • Skills

    • Attitudes

  • 32

    www.swebok.org

  • 33

    www.swebok.org

  • 34

    Defect classes and Defect repository

    Requirement/Specification defect classes:
    • Functional description
    • Feature
    • Feature interaction
    • Interface description

    Design defect classes:
    • Algorithmic and processing
    • Control, logic, and sequence
    • Data
    • Module interface description
    • External interface description