
Software Quality Assurance (SQA) - UMass Amherst (laser.cs.umass.edu/courses/cs320.Spring11/documents/12SQA-2.pdf)



  • Software Quality Assurance (SQA)

    Software Quality Assurance: use of analysis to validate artifacts (requirements analysis, design analysis, code analysis, and testing); technical/document reviews; control of changes (configuration management)

  • Costs of Poor Quality: increased cost to find and fix problems, increased cost to distribute modifications, increased customer support, product liability, failure in the marketplace, failure of the system

    Terminology: correctness, reliability, testing, debugging, failure, fault/defect, error, verification, validation, V&V, IV&V

  • Terminology

    correctness -- the software (artifact) is consistent with its specification; the specification could be wrong or incomplete; rarely is the specification correct, and rarely is the software correct

    reliability -- (high) probability that the software conforms to its specification; usually a statistical measure based on past performance, e.g., mean time to failure; need to know the operational profile, but it is frequently not known
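A reliability measure like mean time to failure can be sketched as a simple calculation. This is an illustrative example, not from the slides; the uptime figures are hypothetical observations between successive failures.

```python
# Hypothetical uptimes (hours) observed between successive failures.
uptimes_hours = [120.0, 95.5, 200.0, 64.5]

# MTTF is estimated as the average observed time between failures.
mttf = sum(uptimes_hours) / len(uptimes_hours)
print(f"Estimated MTTF: {mttf} hours")  # Estimated MTTF: 120.0 hours
```

Note that this kind of estimate is only as good as the operational profile behind it: if the observed usage differs from real usage, the statistic misleads.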

    Terminology (continued)

    testing -- execute the software on selected test cases and evaluate the results (test oracle); also evaluate performance and ease of use; testing can demonstrate the presence of bugs but can never prove their absence; can rarely rely on exhaustive testing; how should you select test cases? how do you know when to stop testing?
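The idea of executing selected test cases against an oracle can be sketched in a few lines. This is an illustrative example, not from the slides: `my_sort` is a hypothetical artifact under test, and the built-in `sorted` serves as an independent oracle that defines the expected output for each case.

```python
def my_sort(xs):
    # Hypothetical artifact under test (a stand-in implementation).
    return sorted(xs)

# Selected test cases -- a sparse sample, not an exhaustive set.
test_cases = [[], [1], [3, 1, 2], [2, 2, 1]]

for case in test_cases:
    # The oracle (built-in sorted) supplies the expected result.
    assert my_sort(case) == sorted(case), f"failure on input {case}"
print("all selected cases passed")
```

Passing these cases demonstrates correct behavior only on the sampled inputs; it proves nothing about the rest of the input space.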

  • Testing is a sampling technique

    [Figure: test cases shown as x's sparsely sampling the input space]

    Terminology (Continued)

    failure -- an erroneous result: the software produces incorrect outputs or fails to meet a real-time constraint

    fault -- the cause of one or more failures

    error -- an incorrect concept, e.g., a design error or logic error

    debugging -- the process of finding the cause of a failure and finding a way to fix the associated faults (without introducing additional faults)
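The error/fault/failure distinction can be made concrete with a small sketch. This example is illustrative, not from the slides; the `average` function is hypothetical.

```python
# error: the (hypothetical) designer's incorrect concept -- assuming
#        the input list is never empty.
# fault: the resulting defect in the code -- no guard for an empty list.
def average(xs):
    return sum(xs) / len(xs)

# failure: the erroneous result observed when the fault executes.
try:
    average([])                 # this input triggers the fault...
except ZeroDivisionError:
    print("failure observed")   # ...and the failure becomes visible
```

The fault is latent until an input exercises it, which is why a symptom can appear long after the fault was introduced.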

  • Terminology (Continued)

    verification -- the process of proving, using mathematical reasoning, that a program is consistent with its specification

    validation -- the processes associated with demonstrating that the software performs reasonably well

    V&V -- verification and validation

    IV&V -- independent V&V

    IV&V

    It is psychologically difficult for programmers to test their own code thoroughly: they inherently want to believe that it works and try to demonstrate this. If the programmers had thought of all the devious cases that could arise, then these would be reflected in the program.

    Want someone who is motivated to find problems, and who has fresh insight into how to exercise the system. But QA is often under pressure not to find problems so that a product can be released on time.

  • Different kinds of testing

    unit testing -- test individual components; use test stubs and drivers

    integration testing -- combine components and test them: big bang testing, or incremental integration building up to subsystems

    system testing -- test the whole system

    acceptance testing -- testing to determine if the product will be accepted
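The role of stubs and drivers in unit testing can be sketched briefly. This is an illustrative example, not from the slides; all names (`total_price`, `stub_lookup`) are hypothetical. The unit under test depends on a pricing lookup, and the stub replaces that dependency so the unit can be tested in isolation before integration.

```python
# Unit under test: depends on an external price lookup.
def total_price(item_ids, price_lookup):
    return sum(price_lookup(i) for i in item_ids)

# Test stub: returns canned responses in place of the real dependency.
def stub_lookup(item_id):
    return {1: 10.0, 2: 2.5}[item_id]

# Test driver: exercises the unit with the stub in place.
assert total_price([1, 2, 2], stub_lookup) == 15.0
print("unit test passed")
```

Because the stub isolates the unit, a failure here points directly at `total_price`, which is exactly the fault-localization benefit the slides attribute to unit testing.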

    Different kinds of testing (continued)

    regression testing -- retesting after some aspect of the system has been changed: determine which "old" test cases must be re-executed; determine what new test cases are required; capture-and-playback technology for UI-centric systems

  • Different kinds of testing (continued)

    black box or functional testing -- testing based on specifications

    white box or structural testing -- testing based on looking at the artifact (the code itself)

    need to do both black box and white box testing
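The black box/white box contrast can be shown on one tiny function. This sketch is illustrative, not from the slides; `my_abs` is a hypothetical artifact whose specification is "return the absolute value of x".

```python
def my_abs(x):
    if x < 0:        # branch 1
        return -x
    return x         # branch 2

# Black-box cases: derived from the specification (|x| >= 0,
# |x| == |-x|) without looking at the code.
assert my_abs(5) == 5 and my_abs(-5) == 5 and my_abs(0) == 0

# White-box cases: chosen by inspecting the code so that both
# branches execute.
assert my_abs(-1) == 1   # covers branch 1
assert my_abs(2) == 2    # covers branch 2
print("both views covered")
```

The two views complement each other: black box testing can miss unexercised code, while white box testing can miss behavior the specification requires but the code never attempts.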

    Testing is hard work

    Typically 50% of software development effort goes to testing; up to 85% for life-critical software

    Objectives: expose errors. A good test case is one with a high probability of finding a new failure; a successful test case is one that finds a new type of failure

  • Exhaustive testing

    Exhaustive testing requires testing all combinations of input values: sorting an array of size 10 containing one each of 1..10 has 10! input combinations (3,628,800 cases)

    Testing usually sparsely samples the input space

    [Figure: test cases shown as x's sparsely sampling the input space]
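The slide's 10! figure, and the sampling alternative, can be checked directly. This sketch is illustrative; the random sample of permutations stands in for a selected test suite.

```python
import math
import random

# All orderings of an array holding one each of 1..10.
total = math.factorial(10)
print(total)  # 3628800 -- far too many to test exhaustively

# Sparse sample: a handful of random permutations instead of all 10!.
random.seed(0)  # fixed seed so the sample is reproducible
for _ in range(5):
    case = random.sample(range(1, 11), 10)  # one random ordering
    # Oracle: a correct sort must always yield 1..10 in order.
    assert sorted(case) == list(range(1, 11))
```

Five cases out of 3,628,800 is a vanishingly thin sample, which is why test-case selection strategy matters so much.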

  • Testing can: uncover failures, show specifications are met for specific test cases, be an indication of overall reliability

    Testing cannot: prove that a program is fault-free or correct

    Testing Principles

    80% of all errors will likely occur in 20% of the modules

    Exhaustive testing is not possible

    Each test case should be chosen to maximize the likelihood of finding a failure

    Testing should also be done by someone other than the developers: developers do the original testing; SQA (or IV&V) does independent testing, usually black box testing

  • Quality must be an on-going concern: can't build quality into software after the fact; tests should be planned long before testing begins

    requirements --> test plans --> functional test cases

    design --> enhanced test plans --> functional and structural test cases

    code --> enhanced structural (and functional) test cases

    maintenance --> enhanced functional and structural test cases

    Debugging

    Find the cause of a failure and fix it. Debugging is difficult because: the symptom may appear long after the fault occurs; it may be difficult to reproduce the symptom; the symptom may be intermittent, indicating a timing- or load-dependent problem.

    Unit testing (and incremental integration testing) simplifies debugging during development because it helps localize faults.