Testing Plan

The Goal of Test Planning

 The IEEE Standard 829-1998:
     To prescribe the scope, approach,
      resources, and schedule of the testing
      activities. To identify the item being tested,
      the features to be tested, the testing tasks
      to be performed, the personnel responsible
      for each task, and the risk associated with
      the plan.
The Goal of Test Planning

 To establish the list of tasks which, if
  performed, will identify all of the
  requirements that have not been met in
  the software. The main work product is
  the test plan.
Test Methodology

 Test Plan
 Test Design
 Test Implementation
 Results Analysis and Documentation
Test Plan

 The test plan is a by-product of the
  detailed planning process that’s
  undertaken to create it. It’s the planning
  process that matters, not the resulting
  document.
(..cont) Test Plan

 The test plan documents the overall approach
  to the test. In many ways, the test plan serves
  as a summary of the test activities that will be
  performed.
 It shows how the tests will be organized, and
  outlines all of the tester needs which must be
  met in order to properly carry out the test.
 The test plan should be inspected by members
  of the engineering team and senior members
  of the project team.
(..cont) Test Plan

 The initial test plan is abstract, and the
  final test plan is concrete (fuzzy to
  focused).
 The initial test plan contains high-level
  ideas about testing the system without
  getting into the details of exact test
  cases.
 The most important test cases come
  from the requirements of the system.
Steps for Test Plan

 Plan the test phases
 Define the test strategy
 Plan the resource requirements
 Distribute tester assignments
 Plan the test schedule
Test Phase

 Based on the proposed development
  model, decide whether unique phases,
  or stages, of testing should be
  performed over the course of the project.
 In the code-and-fix model, there’s probably
  only one test phase – test until someone
  yells stop.
 In the waterfall and spiral models, there can
  be several test phases, from examining
  the product spec to acceptance testing.
(..cont) Test Phase

 Each phase must have criteria defined
  for it that objectively and absolutely
  declares if the phase is over and the
  next one has begun.
Test Strategy

 Test strategy describes the approach that the
  test team will use to test the software both
  overall and in each phase.
 When you are presented with a product spec:
     Is it better to use black-box or white-box
      testing?
     If a mix – when will you apply each, and to
      which parts?
     Test manually or use tools? If using tools,
      develop the tools or purchase them?
Resource Requirement

 The process of deciding what’s
  necessary to accomplish the testing.
 Everything that could possibly be used
  for testing over the course of the project
  needs to be considered.
  (..cont) Resource Requirement

 Example:
     People: How many, what experience, what expertise?
      Should they be full-time, part-time, contract, or students?
     Equipment: Computers, test hardware, printers, tools.
     Office and lab space: How big? How will they be
      arranged?
     Software: Word processors, databases, custom tools.
      What will be purchased? What needs to be written?
     Outsource companies: Will they be used? What criteria?
Tester Assignments

 Break out the individual tester
  assignments.
 The inter-group responsibilities
  discussed earlier dealt with what
  functional group (management, test,
  programmers, and so on) is responsible
  for what high-level task.
 Identify the testers responsible for each
  area of the software and for each
  testable feature.
 Example of Tester Assignments

Tester    Test Assignments
Aliah     Character formatting: fonts, size, color, …
Sarah     Layout: bullets, paragraphs, tabs, …
Luis      Configuration and compatibility.
Jolie     UI: usability, appearance, accessibility.
Test Schedule

 The test schedule takes all the
  information presented so far and maps it
  into the overall project schedule.
 Critical stage because a few highly
  desired features that were thought to be
  easy to design and code may turn out to
  be very time consuming to test.
 Gantt Chart (Tools: Microsoft Project).
(..cont) Test Schedule

 An important consideration with test
  planning is that the amount of test work
  typically is not distributed evenly over the
  entire product development cycle.
 Some testing occurs early in the form of
  spec and code reviews, tool
  development, etc., but the number of
  testing tasks and the number of people
  and amount of time spent testing often
  increases over the course of the project.
 Example of Test Schedule

Testing Task          Date

Test Plan Complete    05/03/2008
Test Cases Complete   01/06/2008
Test Pass #1          15/06/2008 – 01/08/2008
Test Pass #2          15/08/2008 – 01/10/2008
Test Pass #3          15/10/2008 – 15/11/2008
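A schedule like the one above can be sanity-checked in code. The sketch below (the helper name and the way the data is stored are our own choices, not part of any standard) parses the DD/MM/YYYY date spans and reports how long each test pass lasts:

```python
from datetime import date

# Hypothetical helper: parse a DD/MM/YYYY date string, as used in the
# schedule table above.
def parse_dmy(s: str) -> date:
    day, month, year = (int(part) for part in s.split("/"))
    return date(year, month, day)

# The three test passes from the example schedule.
test_passes = {
    "Test Pass #1": ("15/06/2008", "01/08/2008"),
    "Test Pass #2": ("15/08/2008", "01/10/2008"),
    "Test Pass #3": ("15/10/2008", "15/11/2008"),
}

for name, (start, end) in test_passes.items():
    days = (parse_dmy(end) - parse_dmy(start)).days
    print(f"{name}: {days} days")
```

Mapping each pass onto the overall project schedule this way makes it easy to spot a pass that is too short for the features assigned to it.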
Test Design

 Definition by IEEE 829:
  Test design specification “refines the test
  approach (defined in the test plan) and
  identifies the features to be covered by
  the design and its associated tests. It
  also identifies the test cases and test
  procedures, if any, required to
  accomplish the testing and specifies the
  feature pass/fail criteria”.
(..cont) Test Design

 Based on IEEE 829 standard, test
  design should include:
     Identifiers
     Features to be tested
     Approach
     Test case identification
     Pass/fail criteria
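As an illustrative sketch only, the IEEE 829 test design spec fields listed above could be captured as a simple record in code; the class and field names here are our own assumptions, not anything the standard prescribes:

```python
from dataclasses import dataclass, field

# Sketch of an IEEE 829-style test design spec as a data record.
# Field names are illustrative assumptions, not a standard API.
@dataclass
class TestDesignSpec:
    identifier: str                              # unique ID for reference/lookup
    features_to_be_tested: list[str]             # features covered by this design
    approach: str                                # general testing approach
    test_case_ids: list[str] = field(default_factory=list)
    pass_fail_criteria: str = ""

spec = TestDesignSpec(
    identifier="WP-DS-004",
    features_to_be_tested=["font size selection and display"],
    approach="Black-box partitions supplemented with white-box examples",
    test_case_ids=["15326", "15327"],
    pass_fail_criteria="All test cases run without finding a bug",
)
print(spec.identifier, len(spec.test_case_ids))
```

Keeping the spec in a structured form like this makes the cross-references (plan, test cases) checkable by a script rather than by eye.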

Identifiers

 A unique identifier that can be used to
  reference and locate the test design
  spec.
 The spec should also reference the
  overall test plan and contain pointers to
  any other plans or specs that it
  references.
Features to be tested

 A description of the software feature
  covered by the test design spec.
 Example: “the addition function of
  calculator”, “font size selection and
  display in WordPad”, and “video card
  configuration testing of QuickTime”.

Approach

 A description of the general approach that will be
  used to test the features.
 It should expand on the approach, if any, listed in the
  test plan, describe the technique to be used, and
  explain how the results will be verified.
 Example: “A testing tool will be developed to
  sequentially load and save pre-built data files of
  various sizes. The number of data files, sizes, and the
  data they contain will be determined through black-
  box technique and supplemented with white-box
  examples from the programmer. A pass or fail will be
  determined by comparing the saved file bit-for-bit
  against the original using a file compare tool”.
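The bit-for-bit comparison described in that example can be sketched with Python's standard `filecmp` module (the function name `files_match` and the demo files are our own illustration):

```python
import filecmp
import tempfile
from pathlib import Path

# Sketch of the pass/fail check above: compare a saved file bit-for-bit
# against the original. shallow=False forces a full content comparison
# instead of just comparing os.stat() signatures.
def files_match(original: str, saved: str) -> bool:
    return filecmp.cmp(original, saved, shallow=False)

# Tiny self-contained demo using temporary files.
with tempfile.TemporaryDirectory() as tmp:
    original = Path(tmp) / "original.dat"
    same = Path(tmp) / "saved_ok.dat"
    diff = Path(tmp) / "saved_corrupt.dat"
    original.write_bytes(b"\x00\x01\x02\x03")
    same.write_bytes(b"\x00\x01\x02\x03")
    diff.write_bytes(b"\x00\x01\x02\xff")    # last byte flipped

    identical = files_match(str(original), str(same))
    corrupted = files_match(str(original), str(diff))
    print(identical, corrupted)  # True False
```

A dedicated file-compare tool would add reporting of where the files differ, but the pass/fail decision itself is just this equality check.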
Test case identification

 A high-level description and references
  to the specific test cases that will be
  used to check the feature.
 It should list the selected equivalence
  partitions and provide references to the
  test cases and test procedures used to
  run them.
 (..cont) Test case identification

 Example:

 Check the highest possible value   Test case ID # 15326
 Check the lowest possible value    Test case ID # 15327
(..cont) Test case identification

 It’s important that the actual test case
  values aren’t defined in this section.
 For someone reviewing the test design
  spec for proper test coverage, a
  description of the equivalence partitions
  is much more useful than the specific
  values themselves.
Pass/Fail Criteria

 Describe exactly what constitutes a pass
  and a fail of the tested feature.
 What is acceptable and what is not?
 Simple: a pass is when all the test cases
  run without finding a bug.
 Fuzzy: a failure is when 10% or more of
  the test cases fail.
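Both criteria above are mechanical enough to express in code. In this sketch the function names are our own, and the 10% threshold is the "fuzzy" criterion from the slide:

```python
# "Simple" criterion: pass only if every test case ran without finding a bug.
def simple_pass(results: list[bool]) -> bool:
    return all(results)

# "Fuzzy" criterion: declare failure when 10% or more of the cases fail.
def fuzzy_fail(results: list[bool], threshold: float = 0.10) -> bool:
    failed = results.count(False)
    return failed / len(results) >= threshold

results = [True] * 18 + [False] * 2   # 2 of 20 failed -> exactly 10%
print(simple_pass(results))  # False: not every case passed
print(fuzzy_fail(results))   # True: failure rate reached the threshold
```

Stating the criterion this precisely in the design spec removes arguments later about whether a feature "passed".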
Test Case Planning – why?

 Organization
   Even a small software project may have
    thousands of test cases.
   The test cases may have been created by
    several testers over the course of several
    months or years.
   Proper planning will organize them so that
    all the testers and other project team
    members can review and use them.
(..cont) Test Case Planning – why?
 Repeatability
     The same test cases can be used several
      times to look for new bugs and to make
      sure old ones get fixed.
     Without test case planning, it would be
      impossible to know what test cases were
      last run and exactly how they were run so
      that we can repeat the exact tests.
(..cont) Test Case Planning – why?
 Tracking
     How many test cases did you plan to run?
     How many did you run on the last software
      release?
     How many passed and how many failed?
(..cont) Test Case Planning – why?
 Proof of testing (or not testing)
   In high-risk industries, the software test
    team must prove that it did run the tests
    that it planned to run.
   It could actually be illegal, and dangerous,
    to release software in which a few test cases
    were skipped.
   Proper test case planning and tracking
    provides a means for proving what was
    tested.
Test Case - Definition

 IEEE 829:
 Test case specification documents the
 actual values used for input along with
 the anticipated outputs. A test case also
 identifies any constraints on the test
 procedures resulting from use of that
 specific test case.
Test Case Specs

 IEEE 829 standard lists some important
  information that should be included in the test
  case specs:
     Identifiers
     Test item
     Input specification
     Output specification
     Environmental needs
     Special procedural requirements
     Intercase dependencies
(..cont) Test Case Specs

 Identifiers
    A unique identifier is referenced by the test
     design specs and the test procedure
     specs.
 Test item
    Describes the detailed feature, code module,
     etc. that’s being tested.
    Provides references to product specs or
     other design docs on which the test case
     was based.
 (..cont) Test Case Specs
 Input specification
    List of all the inputs or conditions given to the
     software to execute the test case.
    If you are testing a file-based product, it would
     be the name of the file and a description of its
     contents.
 Output specification
   Results you expect from executing the test
    case.
   Did all the contents of the file load as
    expected?
(..cont) Test Case Specs

 Environmental needs
     Hardware, software, test tools, facilities,
      staff, etc. that are necessary to run the test.
 Special procedural requirements
     Describe anything unusual that must be
      done to perform the test.
     Example: testing a nuclear power plant..?
(..cont) Test Case Specs

 Intercase dependencies
     If the test case depends on another test
      case or might be affected by another, that
      information should be included here.
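Once intercase dependencies are recorded in the specs, a valid run order can be derived rather than guessed. This sketch uses Python's standard `graphlib`; the test case IDs and dependency edges are made up for illustration:

```python
from graphlib import TopologicalSorter

# Each test case maps to the set of cases it depends on (or might be
# affected by). These IDs and edges are illustrative only.
dependencies = {
    "TC-3": {"TC-1", "TC-2"},   # TC-3 must run after TC-1 and TC-2
    "TC-2": {"TC-1"},
    "TC-1": set(),
}

# static_order() yields an order where every case appears after the
# cases it depends on.
run_order = list(TopologicalSorter(dependencies).static_order())
print(run_order)
```

If the dependency information in this section is kept machine-readable, the same structure also detects cycles, which `TopologicalSorter` reports as an error.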
(..cont) Test Case Specs

 Example: Test case presented in the form of
  matrix or table
  Test Case ID   Printer Mfg   Model       Mode    Option

  WP0001         Canon         BJC-700     B/W     Text
  WP0002         HP            LaserJet IV Color   Auto
  WP00010        HP            LaserJet IV High    Draft
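A configuration matrix like the one above is often generated rather than hand-written. The sketch below builds the full cross-product of the example's printers, modes, and options; the ID format `WPnnnn` follows the table, while the generation approach itself is our own illustration:

```python
from itertools import product

# Values taken from the example matrix above.
printers = [("Canon", "BJC-700"), ("HP", "LaserJet IV")]
modes = ["B/W", "Color", "High"]
options = ["Text", "Auto", "Draft"]

# One test case per (printer, mode, option) combination.
matrix = [
    {"id": f"WP{i:04d}", "mfg": mfg, "model": model, "mode": mode, "option": opt}
    for i, ((mfg, model), mode, opt)
    in enumerate(product(printers, modes, options), start=1)
]
print(len(matrix))                       # 2 printers x 3 modes x 3 options = 18
print(matrix[0]["id"], matrix[0]["mfg"])
```

In practice the full cross-product grows quickly, which is exactly why equivalence partitioning is used to prune it to a representative subset.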
Test Procedure / Implementation

 Execute the test cases.
 IEEE 829 defines the test procedure
  specification “identifies all the steps
  required to operate the system and
  exercise the specified test cases in order
  to implement the associated test design.”
Test Procedures

 How to perform the test cases.
 Information that needs to be defined:
     Identifiers
     Purpose
     Special requirements
     Procedure steps
Test Procedure

 Identifiers
    A unique identifier that ties the test procedure
     to the associated test cases and test design.
 Purpose
   The purpose of the procedure and reference
    to the test cases that it will execute.
 Special requirements
   Other procedures, special testing skills, or
    special equipment needed to run the
    procedure.
(..cont) Test Procedure
 Procedure steps
     Detailed description of how the tests are to
      be run:
        Log – Tells how and by what method the results
         and observations will be recorded.
        Setup – How to prepare for the test.

        Start – Steps used to start the test.

        Procedure – Steps to run the test.

        Measure – How the results are to be
         measured.
(..cont) Procedure Steps

     Shut down – Steps for suspending the test for
      unexpected reasons.
     Restart – How to pick up the test at certain
      point if there’s failure or after shutting down.
     Stop – Steps for an orderly halt to the test.

     Wrap up – How to restore the environment to
      its pre-test condition.
     Contingencies – What to do if things don’t go
      as planned.
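The procedure steps above can be sketched as a small runner. The hook names (setup, start, procedure, measure, wrap up) follow the slide; the function signature itself is our own sketch, not part of IEEE 829:

```python
# Skeleton of a test procedure runner following the steps above.
def run_procedure(setup, start, procedure, measure, wrap_up, log):
    """Run one test procedure, guaranteeing wrap-up even on failure."""
    log("setup")
    setup()
    try:
        log("start")
        start()
        log("procedure")
        result = procedure()
        log("measure")
        return measure(result)
    finally:
        # Wrap up always runs: restore the environment to its
        # pre-test condition even if the procedure raised.
        log("wrap up")
        wrap_up()

events = []
outcome = run_procedure(
    setup=lambda: None,
    start=lambda: None,
    procedure=lambda: 2 + 2,             # stand-in for the real test steps
    measure=lambda r: "pass" if r == 4 else "fail",
    wrap_up=lambda: None,
    log=events.append,
)
print(outcome, events)
```

The `try/finally` is the code analogue of the Wrap up step: the environment is restored whether the procedure stops, shuts down, or fails mid-run.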
Example of test procedure
Identifier: WinCalcProc98.1872

Purpose: This procedure describes the steps necessary to execute the Addition function test
   cases … through …

Special Requirements: No special requirements (individual test cases).

Procedure Steps:
   Log: The tester will use WordPad with the Testlog template …
   Setup: The tester must install a clean copy of …
   Start: 1) Boot up Windows 98
          2) Click the Start button
          3) …
   Procedure: For each test case identified above, enter the test input data using the
   keyboard …
   Measure: …
Results Analysis and Documentation
 Fundamental principles for reporting a bug:
    Report bugs as soon as possible
    Effectively describe the bugs
        Minimal, singular, obvious, and general, etc.
     Be non-judgmental in reporting bugs
        Written professionally – against the product, not the
         person; state only facts.
     Follow up on your bug reports
        Make sure each bug is reported properly and
         given the attention it needs to be addressed.
        Get them fixed.
Not all bugs are created equal
 A bug that corrupts a user’s data is more
  severe than one that’s a simple
  misspelling.
 But what if the data corruption can
  occur in an instance so rare that no
  user is ever likely to see it, and the
  misspelling causes every user to have
  problems installing the software?
 Which one is more important to fix?
How to classify bugs?

 General concept used:
     Severity
         Indicates how bad the bug is; the likelihood
         and the degree of impact when the user
         encounters the bug.
     Priority
         Indicates how much emphasis should be
         placed on fixing the bug and the urgency of
         making the fix.
Severity:
1.   System crash, data loss, data corruption, security breach.
2.   Operational error, wrong results, loss of functionality.
3.   Minor problem, misspelling, UI layout, rare occurrence.
4.   Suggestion.

Priority:
1.     Immediate fix, blocks further testing, very visible.
2.     Must fix before the product is released.
3.     Should fix when time permits.
4.     Would like to fix, but the product can be released as it is.

Example:
Data corruption: Severity 1, Priority 3.
Misspelling: Severity 3, Priority 2.
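The example shows why priority, not severity, drives the fix order. This sketch sorts a made-up bug list by priority first, with severity breaking ties (both scales use 1 as most urgent, as above):

```python
# Made-up bugs for illustration, using the severity/priority scales above.
bugs = [
    {"title": "Data corruption in rare path", "severity": 1, "priority": 3},
    {"title": "Misspelling in installer",     "severity": 3, "priority": 2},
    {"title": "Crash on startup",             "severity": 1, "priority": 1},
]

# Priority drives the fix order; severity breaks ties between equal priorities.
triage = sorted(bugs, key=lambda b: (b["priority"], b["severity"]))
print([b["title"] for b in triage])
```

Note the outcome matches the slide's example: the severity-3 misspelling is fixed before the severity-1 data corruption, because its priority is higher.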