SW_Testing_at_a_Glance.ppt / 2010-02-02 / Jari Tahvanainen



Page 1

SW Testing at a Glance – or Two

Jari Tahvanainen

Testing Specialist

Nokia/Maemo Devices

Page 2

Agenda

• "Bug hunting" game - 15 min

• Foreword – 15 min

• Basics – 45 min
  • Testing and Standards
  • Testing Model
  • Test Strategy
  • Test Levels
  • Test Types
  • Static testing
  • Dynamic testing

• Test "Domains" – 15 min
  • Development Testing
  • System and Acceptance testing

Page 3

Let’s start …

• Bug hunting – Pair exercise

• Place two big bugs (2x2 squares) and six small bugs (1x1 squares) on your grid

• Start shooting at your opponent's bugs

• Time reserved – 15 minutes

Page 4

The world is changing …

• Internet Applications - Open Source Development

• Agile Development – Driving development with Tests

• Business Driven Test Management

• Testing – more with less, faster and better

Page 5

…, but some things remain

• Some characteristics of good testing [ISTQB Foundations]

• For every development activity there is a corresponding testing activity

• Each test level has test objectives specific to that level

• The analysis and design of tests for a given test level should begin during the corresponding development activity

• Testers should be involved in reviewing documents as soon as drafts are available in the development life cycle

Page 6

Page 7

Testing and Standards

• Quality Assurance – you shall test
  • ISO 9001:2000

• Testing Standards – you shall test like this
  • BS 7925 – 1) a vocabulary of terms used in software testing, 2) a standard for software component testing
  • IEEE 829 – requirements (= structure) for per-phase test planning documentation (project, component, integration, system, etc.)
  • IEEE 1028 – software review techniques
  • IEEE 1012 – standard for software verification and validation

• Related standards
  • IEEE 1044 – a standard for the classification of software anomalies
  • ISO/IEC 12207 – standard for information technology – software life cycle processes
  • ISO/IEC 9126 – software quality model comprising six quality characteristics: functionality, usability, reliability, efficiency, maintainability and portability
  • ISO/IEC 9241-11 – defines the usability quality characteristics in more detail than ISO/IEC 9126

• And others …

Page 8

Generic Testing Model

• The generic testing model according to [ISTQB Practitioner] includes several areas one should consider:

• Test Terminology: A common set of test terminology ensures efficient communication between all parties concerned with software testing. [ISTQB Practitioner]

• Test Policy: A high level document describing the principles, approach and major objectives of the organization regarding testing. [ISTQB Glossary]

• Test Strategy: A high-level description of the test levels to be performed and the testing within those levels for an organization or programme (one or more projects). [ISTQB Glossary]

• Project Test Plan = Master Test Plan: A test plan that typically addresses multiple test levels. [ISTQB Glossary]

• Level Test Plan: A test plan that typically addresses one test level. [ISTQB Glossary]

• Test Plan: A document describing the scope, approach, resources and schedule of intended test activities. It identifies amongst others test items, the features to be tested, the testing tasks, who will do each task, degree of tester independence, the test environment, the test design techniques and entry and exit criteria to be used, and the rationale for their choice, and any risks requiring contingency planning. It is a record of the test planning process. [IEEE 829]

• Phase Test Plan: A test plan that typically addresses one test phase. [ISTQB Glossary]

• Test Process: The fundamental test process comprises test planning and control, test analysis and design, test implementation and execution, evaluating exit criteria and reporting, and test closure activities. [ISTQB Glossary]

• Incident management: The process of recognizing, investigating, taking action and disposing of incidents. It involves logging incidents, classifying them and identifying the impact. [ISTQB Glossary].

• Test Documentation: Documentation for test planning, test specification and test reporting.

• Test Process Improvement (TPI): A continuous framework for test process improvement that describes the key elements of an effective test process, especially targeted at system testing and acceptance testing. [ISTQB Glossary]

Page 9

Test Process

• The Structured Test Process (TMap) consists of:

• Master test plan, managing the total test process

• Acceptance and system tests, defined to provide a quality assessment of, and risk recommendations for, the product

• Development tests, defined to provide a quality assessment of the code

• Supporting processes (e.g. Requirement Management, Issue Management, Release Management, Configuration Management)

Page 10

Test Strategy

• According to the ISTQB Glossary, a Test Strategy is "A high-level description of the test levels to be performed and the testing within those levels for an organization or programme (one or more projects)."

• During test planning one should document the programme/project test strategy, which typically includes
  • the decisions made, following from the (test) project's goal and the risk assessment carried out,
  • starting points regarding the test process,
  • the test design techniques to be applied,
  • exit criteria and
  • test types to be performed.

Page 11

Test (Design) Techniques

• Test design techniques (procedures used to derive and/or select test cases) help to improve test efficiency by
  • improving the test cases' ability to find defects compared to random selection or guesswork, and
  • reducing the number of test cases

• Techniques can be grouped e.g. in the following way:
  • Specification-based (behaviour-based, black-box): equivalence partitioning (EP), boundary value analysis (BVA), decision tables, state transition testing, classification tree method, pairwise testing (a.k.a. all-pairs testing), use cases
    • Models, either formal or informal, are used for the specification of the system
    • From these models test cases can be derived systematically
  • Structure-based (white-box, code-based) test techniques
    • Information about how the software is constructed is used to derive the test cases
    • Code coverage tools (e.g. gcov) can be used to capture the coverage achieved by the tests
  • Defect-based
    • The type of defect sought is used as the basis for test design
  • Experience-based: error guessing, checklist-based testing, exploratory testing, attacks
    • Utilize testers' skills and intuition, along with their experience with similar applications and technologies

• Because no single test technique can find all bugs, several techniques are typically used together
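The specification-based techniques above can be made concrete with a small sketch. The validator below and its 18–65 limit are illustrative assumptions (not from the slides); the point is how EP picks one representative per partition while BVA probes the values on and around each boundary.

```python
# Sketch: equivalence partitioning (EP) and boundary value analysis (BVA)
# for a hypothetical rule "accept ages 18..65 inclusive".

def is_valid_age(age: int) -> bool:
    """Hypothetical requirement: applicants aged 18 to 65 are accepted."""
    return 18 <= age <= 65

# EP: one representative value per partition (below / inside / above the range).
ep_cases = {17: False, 40: True, 70: False}

# BVA: values on and immediately around each boundary.
bva_cases = {17: False, 18: True, 19: True, 64: True, 65: True, 66: False}

for value, expected in {**ep_cases, **bva_cases}.items():
    assert is_valid_age(value) == expected, f"failed for {value}"
print("EP and BVA cases pass")
```

Note how BVA needs only six cases to cover both boundaries, instead of testing all 48 valid ages.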

Page 12

Test levels (and "engineering" domains)

[Figure: test levels – component testing (unit + module testing), system testing and acceptance testing – mapped against validation & verification, the development activities (product & technology management, design & engineering, construction), development testing, and the problem, solution, functional and technical domains.]

Page 13

Testing and Quality (Test Types)

• Requirement-based testing - An approach to testing in which test cases are designed based on test objectives and test conditions derived from requirements, e.g. tests that exercise specific functions or probe nonfunctional attributes such as reliability or usability.

• Non-Functional testing - testing of non-functional quality characteristics such as reliability, usability and efficiency (ISO/IEC 9126)

• Efficiency
  • Performance Testing – the ability of a system to respond to user or system inputs within a specified time and under specified conditions
  • Load Testing – the ability of a system to handle increasing levels of anticipated realistic loads resulting from the transaction requests generated by a number of parallel users
  • Stress Testing – the ability of a system to handle peak loads at or beyond maximum capacity

• Reliability
  • Reliability testing – monitor a statistical measure of software maturity over time and compare it to a desired reliability goal (Mean Time Between Failures, Mean Time To Repair, etc.)
  • Robustness Testing – tests for fault tolerance
  • Recoverability Testing – a system's ability to recover from hardware or software failures

• Usability
• Maintainability
• Portability
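A minimal sketch of the load-testing idea above: simulated parallel users issue requests, and the test checks that all complete within a budget. The handler, the 20-user count and the 1-second budget are illustrative assumptions, not figures from the slides.

```python
# Sketch: a tiny load test - N simulated parallel users, one shared budget.
import threading
import time

def handle_request(results, i):
    time.sleep(0.01)        # stand-in for real request processing
    results[i] = True

users = 20
results = [False] * users
threads = [threading.Thread(target=handle_request, args=(results, i))
           for i in range(users)]

start = time.perf_counter()
for t in threads:
    t.start()
for t in threads:
    t.join()
elapsed = time.perf_counter() - start

assert all(results), "some simulated users were not served"
assert elapsed < 1.0, f"load test exceeded budget: {elapsed:.3f} s"
print(f"{users} parallel users served in {elapsed:.3f} s")
```

A real load test would ramp the user count up over several runs to find the level at which the budget is first exceeded.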

Page 14

Static Testing

• Testing of a component or system at specification or implementation level without execution of that software, e.g. reviews or static code analysis.

• Review
  • All types of documentation can be subjected to a review, e.g. requirement specifications, user stories, concepts, test plans, test assets, etc.
  • Requirements / user story review
  • Design review (e.g. technical reviews or inspections)
  • Acceptance review (e.g. management approval for a system milestone)

• Code static analysis
  • Reviews (walk-through, review, inspection)
    • Review done by a senior developer
    • Review by a fellow developer
  • Static analysis is one important means of defect and/or failure prevention:
    • Coding convention checks
    • Locating reliability items (e.g. causing the system or an application to panic)
    • Locating security items (e.g. caused by buffer overflows, etc.)
  • Code metrics provided by static analysis tools can also contribute to higher test levels as data for risk analysis affecting the maintainability, reliability and security of the code.
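The "locating security items" idea above can be sketched in a few lines: a static check inspects source code without executing it. The example below is an illustrative toy, flagging calls to Python's eval() (a common security/reliability finding), not a description of any tool mentioned in the slides.

```python
# Sketch: a tiny static-analysis check - scan Python source for eval() calls
# by walking its abstract syntax tree, without running the code.
import ast

def find_eval_calls(source: str) -> list:
    """Return the line numbers at which eval() is called."""
    tree = ast.parse(source)
    return [node.lineno for node in ast.walk(tree)
            if isinstance(node, ast.Call)
            and isinstance(node.func, ast.Name)
            and node.func.id == "eval"]

sample = "x = 1\ny = eval('x + 1')\n"
print(find_eval_calls(sample))  # -> [2]
```

Production tools (lint checkers, security scanners) apply hundreds of such rules plus data-flow analysis, but the principle is the same: analyse the code's structure, not its execution.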

Page 15

Dynamic Testing

• Dynamic Testing
  • Testing that involves the execution of the software of a component or system. It may be applied to detect
    • Functional compliance (verification and validation)
    • Memory (resource) leaks
    • Incorrect use of pointers and other corruption (e.g. of the system stack)
    • Performance bottlenecks
  • The main thing is to analyze different aspects of the component and/or system while it is running
    • Resource usage (e.g. with debugging and profiling programs)
    • Performance (e.g. with throughput and response time measurement tools)
    • Wakeup / use time (different tools having timing capabilities)
    • Reliability (e.g. tools for endurance and stress testing)
    • Security (e.g. fuzzers, etc.)
    • Test automation tools

• Code Coverage
  • An analysis method that determines which parts of the software have been executed (covered) by the test suite and which parts have not, e.g. statement coverage, decision coverage or condition coverage.
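The statement-coverage idea can be illustrated with a self-contained sketch: record which lines of a function execute for a given test input, and observe that different inputs cover different branches. Real projects would use gcov (for C) or a dedicated Python coverage tool; the tracer below is only a demonstration of the principle.

```python
# Sketch: statement coverage by tracing line events while a function runs.
import sys

def absolute(x):
    if x < 0:
        return -x
    return x

def covered_lines(func, *args):
    """Return the set of line offsets of func executed for the given input."""
    executed = set()
    def tracer(frame, event, arg):
        if event == "line" and frame.f_code is func.__code__:
            executed.add(frame.f_lineno - func.__code__.co_firstlineno)
        return tracer
    sys.settrace(tracer)
    try:
        func(*args)
    finally:
        sys.settrace(None)
    return executed

# absolute(5) skips the 'return -x' branch; absolute(-5) covers it instead,
# so full statement coverage here needs at least one test per branch.
print(covered_lines(absolute, 5), covered_lines(absolute, -5))
```

This is exactly why coverage reports are read per test suite, not per test: the union of all runs is what shows the untested statements.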

Page 16

Test levels (and domains)

[Figure: the same test-levels diagram as on page 12 – component testing (unit + module testing), system testing and acceptance testing mapped against validation & verification, the development activities, development testing, and the problem, solution, functional and technical domains.]

Page 17

Development Testing

• Development Testing (a.k.a. Developer Testing) is by definition [ISTQB Glossary] "formal or informal testing conducted during the implementation of a component or system, usually in the development environment by developers". Depending on the competences of the developer, this can include anything from component testing (unit, module and component integration testing put together) to feature tests (giving confirmation for the user story) (see Test Levels).

• Unit tests are so named because they each test one unit of code. These tests are usually written by developers as they work on the code (white-box style), to ensure that a specific function works as expected. Whether a module of code has hundreds of unit tests or only five is irrelevant. One function might have multiple tests, to catch corner cases or other branches in the code. A suite of unit tests should never cross process boundaries in a program, let alone network connections; doing so introduces delays that make tests run slowly and discourage developers from running the whole suite.

• A module test exercises a module of code in a similar way to unit tests – the difference is that the tests are designed black-box style (without reference to the internal structure of the module).

• (Component) Integration test – introducing dependencies on external modules or data turns unit tests into integration tests. If one module misbehaves in a chain of interrelated modules, it is not immediately clear where to look for the cause of the failure. These tests check a ready-made part of the developer's product (implemented tasks and user stories), with simulators (stubs, test drivers) standing in for missing external dependencies.

• A feature test checks the developer's whole product (features and user stories done) by pretending to be a user.
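The stub idea mentioned above for integration testing can be sketched briefly. The weather service and the advice function are made-up examples: the stub returns canned data, so the unit under test can be exercised before the real external dependency exists.

```python
# Sketch: a stub standing in for a missing external dependency.

class WeatherServiceStub:
    """Stands in for a not-yet-available external weather service."""
    def current_temp_celsius(self, city: str) -> float:
        # Canned data chosen by the tester, not fetched from anywhere.
        return {"Helsinki": -5.0, "Oulu": -12.0}.get(city, 20.0)

def clothing_advice(service, city: str) -> str:
    """Unit under test: touches the external service only via its interface."""
    temp = service.current_temp_celsius(city)
    return "heavy coat" if temp < 0 else "light jacket"

advice = clothing_advice(WeatherServiceStub(), "Helsinki")
print(advice)  # -> heavy coat
```

Because clothing_advice depends only on the service's interface, the same test later runs unchanged against the real implementation.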

Page 18

Driving Development with Tests

• Test Driven Development (TDD)
  • A development practice in which test cases are implemented before the actual program code.
  • Test cases are implemented by developers, usually with some test framework.
  • The implemented unit tests answer the question: "Does the code function as the programmer intended it to work?"
  • TDD is applied at the code level – each test verifies one small unit of code in isolation
  • Cycle: write the test, make it green, make it clean!

• Acceptance Test Driven Development (ATDD)
  • A testing-oriented development practice in which "acceptance" test cases are implemented before the implementation of the user story starts
  • The main goal of the tests is to answer the question: "Does the system do what the customer wants?"
  • Writing acceptance tests makes the team use the application and see it from the end users' perspective. That helps the team
    • find any peculiar work flows
    • understand how the end users see the application (end-user experience)
  • Cycle:
    • Discuss: understand what the business stakeholder needs from any particular feature
    • Distill: collaborate with business stakeholders to distill the needs into a set of acceptance tests
    • Develop: write code to implement the requested features using TDD
    • Demonstrate: show the business stakeholder the new feature and request feedback
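One TDD cycle ("write the test, make it green, make it clean") can be sketched without any framework, using plain asserts. The roman-numeral function is a made-up exercise, not an example from the slides.

```python
# Sketch of one TDD cycle with plain asserts instead of a test framework.

# 1. Write the test first - at this point roman_numeral does not exist,
#    so running the test would fail ("red").
def test_roman_numeral():
    assert roman_numeral(1) == "I"
    assert roman_numeral(4) == "IV"
    assert roman_numeral(9) == "IX"
    assert roman_numeral(14) == "XIV"

# 2. Make it green: the simplest implementation that passes the test.
def roman_numeral(n: int) -> str:
    values = [(10, "X"), (9, "IX"), (5, "V"), (4, "IV"), (1, "I")]
    out = ""
    for value, symbol in values:
        while n >= value:
            out += symbol
            n -= value
    return out

# 3. Make it clean: refactor freely - the test keeps you honest.
test_roman_numeral()
print("all green")
```

With a framework such as one of the common xUnit-style tools, the cycle is identical; only the test runner changes.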

Page 19

System Testing and Acceptance Testing

• Business Driven Test Management (BDTM)
  • The TMap life cycle model is used in the description of the test process.
  • Keep an eye on the final objective – to provide a quality assessment and risk recommendation about the system
  • Choices must be made about what is tested and how thoroughly. Such choices depend on the risks an organisation thinks it will incur, the available quantities of time and money, and the result the organisation wishes to achieve. The fact that the choices are based on risks, result, time and cost is called business-driven and constitutes the basis of the BDTM approach.
  • The client controls the test process and (helps) determine the test approach. This gives the testing an economic character.

• Risk Based Testing (RBT)
  • Product Risk Analysis (PRA) analyses the product to be tested with the aim of achieving a shared view, among the test manager and other stakeholders, of the more or less risky characteristics and components of the product, so that the thoroughness of testing can be agreed upon. The focus in PRA is on product risks, i.e. what is the risk to the organization if the product does not have the expected quality? The result of the PRA constitutes the basis for the subsequent strategy decisions on testing a characteristic (e.g. a quality characteristic) or object part (component) lightly, thoroughly, or not at all.
  • Feature Risk Analysis (FRA) aims at mitigating critical business risks and finding the critical defects early. It is executed by the scrum team – both the business view and the technical view are combined. The result of the FRA is a list of test goals for the sprint, with an initial assessment of the intensity of the testing needed to mitigate the business risk involved. This initial assessment can be used later in creating the detailed test approach.

Page 20

System Testing and Acceptance Testing

• Functionality (testing)
  • Functional (accuracy) testing: testing the application's, feature's or user story's adherence to the specified or implied requirements.
  • Functional (suitability) testing: evaluating and validating the appropriateness of a set of features for its intended specified tasks. This testing can be based on use cases or procedures. Testing shall also evaluate the software's tolerance to faults in terms of handling unexpected input values, data, messages or triggers (so-called negative tests).
  • SW Interoperability Testing: tests whether a given group of components and applications can function correctly in the intended target configuration (incl. hardware, adaptation, operating system, middleware, etc.).

• Efficiency Testing
  • Performance Testing – the ability of a system to respond to user or system inputs within a specified time and under specified conditions
  • Load Testing – the ability of a system to handle increasing levels of anticipated realistic loads resulting from the transaction requests generated by a number of parallel users
  • Stress Testing – the ability of a system to handle peak loads at or beyond maximum capacity

• Reliability Testing
  • Monitor a statistical measure of software maturity over time and compare it to a desired reliability goal (Mean Time Between Failures, Mean Time To Repair, etc.)
  • Robustness Testing – tests for fault tolerance
  • Recoverability Testing – a system's ability to recover from hardware or software failures
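The performance-testing definition above reduces to a simple pattern: measure the response time of an operation under specified conditions and compare it to a budget. The operation and the 0.5-second budget below are illustrative assumptions.

```python
# Sketch: a minimal performance test - time one operation against a budget.
import time

def operation_under_test(n: int) -> int:
    # Stand-in for the system behaviour being measured.
    return sum(i * i for i in range(n))

def response_time_seconds(func, *args) -> float:
    start = time.perf_counter()
    func(*args)
    return time.perf_counter() - start

elapsed = response_time_seconds(operation_under_test, 100_000)
assert elapsed < 0.5, f"performance budget exceeded: {elapsed:.3f} s"
print("within budget")
```

In practice one would repeat the measurement many times and check a percentile (e.g. the 95th) rather than a single run, since individual timings vary.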

Page 21

References

• www.deltaaxiom.com

• www.testingexperience.com

• http://www.istqb.org/downloads/syllabi/SyllabusFoundation.pdf

• http://www.istqb.org/downloads/syllabi/CTAL_Syllabus_V_2007.pdf

• http://eng.tmap.net/Home/TMap/The_4_essentials/index.jsp

• http://en.wikipedia.org/wiki/ISO_9126

• http://www.ambysoft.com/essays/floot.html

• Terminology
  • http://www.istqb.org/downloads/glossary-current.pdf

• http://eng.tmap.net/Home/TMap/Glossary.jsp

• http://en.wikipedia.org/wiki/Software_testing

Page 22

Testers’ skill levels

• Specialist = Innovator = skilled in more than one area + provides wide-area knowledge
  • test specialist (see test consultant + knowledge about the business + networking)

• Expert = Knowledge sharer = skilled in the testing area + provides knowledge + has interpersonal skills
  • test consultant (communication skills, test process knowledge, scheduling and estimating skills, listening skills, system/application skills, problem solving skills)
  • test manager (task and people skills, promote/market testing, good tester, management skills, problem solving skills, risk assessment)
  • test analyst (analyse/challenge requirements – produce good-quality, effective and efficient test cases, produce expected results, testing skills)

• Engineer = Practitioner (Senior = competent practitioner through experience, or having several role possibilities)
  • non-functional test (technical skills, system/application skills, investigation skills, analysis skills)
  • test environment (technical skills, communication skills, investigation skills, analysis skills)
  • test executor (prioritisation of tests, test logs, monitor progress, analysis skills)
  • test designer (black box, white box, communication skills)
  • test automation (knowledge of tools and support, communication skills)
  • test statistician = technical test analyst (analysis skills, metrics skills, monitoring skills, influencing skills)

• Learner
  • is a learner

Page 23

Reporting To Different People

[Figure: reporting flows between tester, test manager, manager, customer, director and developer. Incident reports to the developer are factual, specific and detailed; progress reports to the test manager are factual, clear, detailed and risk-oriented; progress-and-problem reports are specific, fair, honest and show trends; reports to the customer and director cover risks and costs at a generic, graphical level.]

When a test produces an unexpected outcome, further effort is necessary to classify the incident as a software error, a design error, a specification error, a testing error, etc.