                         Verification and Validation
 Verification: checks that the program conforms to its
    specification.
     – Are we building the product right?
 Validation: checks that the program as implemented meets
    the expectations of the user.
     – Are we building the right product?



                Static Verification
 Program inspection
 Formal methods




                Verification and Proofs of
                       Correctness
 Formally    specify the desired functionality,
    then verify that the program is a correct
    implementation of the specification.




                  Hoare's Rules
 Program fragments and assertions are composed into
    triples {P} S {Q}
     – where P is the precondition assertion, Q is the
       postcondition assertion, and S is a sequence of
       program statements.
     – Interpretation: if P is true before S is executed,
       then when S terminates, Q is satisfied.




                Proofs of Correctness
 Partial correctness: if the precondition is
  true, and the program terminates, then the
  postcondition is satisfied.
 Total correctness: partial correctness plus a proof of
   termination.




                The Assignment Rule
                {P} x := f {Q}
 where the precondition P is obtained from the postcondition
   Q by replacing every occurrence of x with f
   (backward substitution).
 Examples:
   {x = 5} x = x + 1 {x = 6}
   {z > y + 50} x = z - 43 {x > y + 7}
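 A worked reading of the rule for the second example above (a sketch; the Q[f/x] substitution notation is mine):

    % Assignment axiom, backward substitution:  {Q[f/x]}  x := f  {Q}
    % Here Q is  x > y + 7  and f is  z - 43.  Substituting f for x in Q gives
    \{\, z - 43 > y + 7 \,\}\;\; x := z - 43 \;\;\{\, x > y + 7 \,\}
    % and  z - 43 > y + 7  simplifies to  z > y + 50, the precondition shown above.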



      Rule for Sequencing Statements
                 {F1} S1 {F2},   {F2} S2 {F3}
                 ----------------------------
                      {F1} S1; S2 {F3}




        Rule for Conditions and Loops

            {P & C} S1 {Q},   {P & ~C} S2 {Q}
            ----------------------------------
            {P} if C then S1 else S2 endif {Q}

                      {I & C} S {I}
            ----------------------------------
            {I} while C do S od {I & ~C}
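 A small worked example of the while rule (my own program and invariant; it assumes n >= 0): summing 1..n with loop invariant I.

    % Program:   i := 0; s := 0; while i < n do i := i + 1; s := s + i od
    % Invariant: I  ==  s = i(i+1)/2  \wedge  i <= n
    % Premise (the body preserves I):
    \{ I \wedge i < n \}\;\; i := i + 1;\; s := s + i \;\;\{ I \}
    % Conclusion of the while rule:
    \{ I \}\ \textbf{while}\ i < n\ \textbf{do}\ i := i + 1;\ s := s + i\ \textbf{od}\ \{ I \wedge i \ge n \}
    % At exit, I \wedge i >= n together with i <= n forces i = n, hence s = n(n+1)/2.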




                    Software Testing
 Software Requirements Specifications
     – Describe the expected runtime behaviors of the
       software.
 A Test Plan
     – Describes how to test each behavior.
 The software (source code or executable)



                      Testing
 Failure:
     – The departure of program operation from user
       requirements.
 Fault:
     – A defect in a program that may cause a failure.
 Error:
     – A human action that results in software containing
       a fault.

                The Test Plan
 A "living document": it is born with the system and evolves
   as the system evolves. It is what the key decision makers
   use to evaluate the system.
 User objectives.
 System Description and Traceability Matrices.
 Special Risk Elements.
 Required Characteristics -- operational and technical.
 Critical Test Issues -- operational and technical.




                Management Plan
 Integrated Schedule
 Roles and Responsibilities
 Resources and Sharing




                Verification Outline
 Verification to Date
 Previous Results
 Testing Planned
     –    Unresolved Issues
     –    Issues arising during this phase
     –    Scope of Planned tests
     –    Test Objectives
 Special Resources
 Test Articles




                 Validation Outline
 Validation to Date
 Previous Results
 Testing Planned
     –    Unresolved Issues
     –    Issues arising during this phase
     –    Scope of Planned tests
     –    Test Objectives
 Special Resources
 Test Articles



      Test Results and Traceability
 Test Procedures
 Test Reporting
 Development Folders




                Types of Faults
 algorithmic faults
 computation and precision faults
 documentation faults
 stress or overload faults
 capacity or boundary faults
 timing or coordination faults
 throughput or performance faults

                IBM Orthogonal Defect
                    Classification
   Function: fault that affects capability, end-user interfaces, product
    interface with hardware architecture, or global data structure.
   Interface: fault in interfacing with other components or drivers via calls,
    macros, control blocks, or parameter lists.
   Checking: fault in program logic that fails to validate data and values
    properly before they are used.
   Assignment: fault in data structure or code block initialization.
   Timing/serialization: fault that involves timing of shared and real-time
    resources.
   Build/package/merge: fault that occurs because of problems in
    repositories, management of changes, or version control.
   Documentation: fault that affects publications and maintenance notes.
   Algorithm: fault involving efficiency or correctness of an algorithm
    or data structure, but not the design.



                The Testing Process
 Unit testing
 Component testing
 Integration testing
 System testing
 Acceptance testing




                Testing Strategies
 Top-down    testing
 Bottom-up testing
 Thread testing
 Stress testing
 Back-to-back testing




      Traditional Software Testing Techniques

   Black box testing
    – program specifications : functional testing
    – operational profile: random testing, partition testing


   White box testing
    – statement coverage
    – branch coverage
    – data flow coverage
    – path coverage
   Others
    – Stress testing
     – Back-to-back testing

                Defect Testing
 Black-box   testing
 Interface testing
 Structural testing




                Black-Box Testing
 Graph-based testing methods
 Equivalence Partitioning
 Boundary value analysis




                Graph-Based Testing
 Transaction  flow modeling
 Finite state modeling
 Data flow modeling
 Timing modeling




                 Partition Testing
   If an input condition specifies a range, one valid and two
    invalid equivalence classes are defined.
   If an input condition requires a specific value, one valid
    and two invalid equivalence classes are defined.
   If an input condition specifies a member of a set, one valid
    and one invalid class are defined.
   If an input condition is boolean, one valid and one invalid
    class are defined.
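 For illustration (a sketch, not from the original slides; the function isValidAge and the 0..120 range are assumed), the range guideline yields one valid and two invalid classes, each represented by one test value:

    #include <cassert>

    // Hypothetical function under test: valid ages are 0..120 (assumed spec).
    bool isValidAge(int age) {
        return age >= 0 && age <= 120;
    }

    int main() {
        assert(isValidAge(35)  == true);   // representative of the valid class 0..120
        assert(isValidAge(-5)  == false);  // invalid class: below the range
        assert(isValidAge(200) == false);  // invalid class: above the range
        return 0;
    }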



           Boundary Value Analysis
   If an input condition specifies a range bounded by values a and
    b, test cases should be designed with values a and b, and with
    values just above and just below a and b, respectively.
   If an input condition specifies a number of values, test cases
    should be developed that exercise the minimum and maximum
    numbers. Values just above and below the minimum and maximum
    are also tested.
   Apply guidelines 1 and 2 to output conditions.
   If internal program data structures have prescribed boundaries,
    be certain to design a test case to exercise the data structure
    at its boundary.
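 A sketch of guideline 1 for the same assumed 0..120 range: test exactly at, just below, and just above each bound.

    #include <cassert>

    bool isValidAge(int age) { return age >= 0 && age <= 120; }  // assumed spec

    int main() {
        // Lower bound a = 0 and upper bound b = 120, plus their neighbours.
        assert(isValidAge(-1)  == false);  // just below a
        assert(isValidAge(0)   == true);   // a
        assert(isValidAge(1)   == true);   // just above a
        assert(isValidAge(119) == true);   // just below b
        assert(isValidAge(120) == true);   // b
        assert(isValidAge(121) == false);  // just above b
        return 0;
    }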
                Interface Testing
 Parameter interfaces
 Shared memory interfaces
 Procedural interfaces
 Message passing interfaces




                Integration Testing
 top-down integration
 bottom-up integration
 incremental testing




                White-Box Testing
 Statement Coverage Criterion
 Branch coverage criterion
 Data flow coverage criterion
 Path coverage criterion




                Statement Coverage
      while (…) {
         ….
         while (…) {
            …..
            break;
         }
         …..
         break;
      }
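 A concrete variant of the fragment above (the function firstStep and its input are my own): a single test executes every statement, yet because of the break statements the false branches of the two loop conditions are never exercised.

    #include <iostream>

    int firstStep(int n) {
        while (n > 0) {
            int step = 0;
            while (step < n) {
                ++step;
                break;           // inner loop body runs at most once
            }
            n -= step;
            break;               // outer loop body runs at most once
        }
        return n;
    }

    int main() {
        std::cout << firstStep(3) << "\n";   // this one test achieves 100% statement coverage
        return 0;                            // but never evaluates either loop condition to false
    }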

                Branch Coverage
if (StdRec != null)
        StdRec.name = arg[1];
……
…….

 write (StdRec.name)    // fails when StdRec is null, i.e., when the branch above is not taken
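 A minimal C++ sketch of the same situation (the Student/label names are mine): a test set that only achieves statement coverage can pass, while branch coverage forces a test that takes the false branch and exposes the null-pointer fault.

    #include <string>

    struct Student { std::string name; };

    std::string label(Student* rec, const std::string& arg) {
        if (rec != nullptr)
            rec->name = arg;
        // ...
        return rec->name;          // fault: dereferences rec even when it is null
    }

    int main() {
        Student s;
        label(&s, "Alice");        // T1 alone gives full statement coverage; the fault stays hidden
        // label(nullptr, "Bob");  // T2 is needed for branch coverage; it would expose the fault
        return 0;
    }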



                           Data Flow
   Data-flow graph: defined on the control-flow graph by
    associating with each node n the sets of variables
    DEF(n), C-USE(n), and P-USE(n).
   Variable x is in DEF(n) if
     – n is the start node and x is a global, parameter, or
       static local variable, or
     – x is declared in basic block n with an initializer, or
     – x is assigned in basic block n with the =, op=, ++, or --
       operator.
                Data flow Testing
read (x, y);
if x > 0
    z = 1;
else
    z = 0;
if y < 0
    write (z);
else
    write (y / z);

(n = the number of conditional statements)
  Path:        2^n
  Data flow:   n^4
  Branch:      Cyclomatic complexity (McCabe), CC(G) = #E - #N + 2P



                          C-Use
   Variable x is in C-USE(n) if x occurs in basic block n as a
    computational-use (C-USE) expression:
     – a procedure argument,
     – an initializer in a declaration,
     – a return value in a return statement,
     – the second operand of "=",
     – either operand of "op=",
     – the operand of ++, --, or unary *,
     – the first operand of "." or "->".
                        P-Use
 Variable x is in P-USE(n) if x occurs in basic block n as a
    predicate-use (P-USE) expression:
     – as the conditional expression of an if, for, while,
       do, or switch statement, or
     – as the first operand of the conditional expression
       operator (?:), the logical-and operator (&&), or
       the logical-or operator (||).
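 A small fragment (my own; block labels n1..n4 are assumed) annotated with the sets defined above:

    int scale(int x, int limit) {      // n1: x, limit in DEF(n1)   (parameters at the start node)
        int y = x * 2;                 // n2: y in DEF(n2); x in C-USE(n2)   (initializer)
        if (y > limit)                 // n3: y and limit in P-USE(n3)       (if condition)
            y = limit;                 // n4: y in DEF(n4); limit in C-USE(n4)   (second operand of =)
        return y;                      //     y in C-USE of the return block     (return value)
    }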




                 C-Use Coverage
   A C-Use is a variable x and the set of all paths in
    the data-flow graph from node n_a to node n_b such that
     – x is in DEF(n_a), and
     – x is not in DEF(n_i) for any other node n_i on the paths
       (definition-clear paths), and
     – x is in C-USE(n_b).
   A C-Use is covered by a set of tests if at least one
    of the paths in the C-Use is executed when the tests
    are run.



                 P-Use Coverage
   A P-Use is a variable x and the set of all paths in
    the data-flow graph from node n_a to node n_b such that
     – x is in DEF(n_a), and
     – x is not in DEF(n_i) for any other node n_i on the paths
       (definition-clear paths), and
     – x is in P-USE(n_b).
   A P-Use is covered by a set of tests if at least one
    of the paths in the P-Use is executed when the tests
    are run.
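 Applied to the read/write example used elsewhere in these slides (node labels n1..n7 are mine), the uses of z are:

    // n1: read(x, y);
    // n2: if (x > 0)
    // n3:     z = 1;            z in DEF(n3)
    // n4: else z = 0;           z in DEF(n4)
    // n5: if (y < 0)
    // n6:     write(z);         z in C-USE(n6)
    // n7: else write(y / z);    z in C-USE(n7)
    //
    // C-Uses of z to cover: (n3,n6), (n3,n7), (n4,n6), (n4,n7); z has no P-Uses.
    // Four tests cover them:  x=1,y=-1 | x=1,y=1 | x=-1,y=-1 | x=-1,y=1
    // (the last pair (n4,n7) forces write(y/z) with z = 0 and exposes the fault).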



                   Path Coverage
read (x, y);
if x > 0                   // decision 1: x > 0
    z = 1;
else
    z = 0;
if y < 0                   // decision 2: y < 0
    write (z);
else
    write (y / z);

Sample test cases and their outputs:
  x = 1,  y = 1    ->  1
  x = -1, y = -1   ->  0
  x = 0,  y = 0    ->  error (y / z divides by zero)


            Subsumption hierarchy of the coverage criteria
            (A > B means that criterion A subsumes criterion B)

  all-paths > all-du-paths > all-uses
  all-uses  > all-c-uses/some-p-uses > all-c-uses
  all-uses  > all-p-uses/some-c-uses > all-p-uses > branch > statement
  all-c-uses/some-p-uses and all-p-uses/some-c-uses both subsume all-defs


   Complexity of White-Box Testing
 Branch coverage:
     – McCabe's cyclomatic complexity:
       CC(G) = #E - #V + 2P, for G = (V, E)
 All-defs: M + I*V
 All-p-uses, all-c-uses/some-p-uses,
   all-p-uses/some-c-uses, all-uses: N^2
 All-du-paths, all-paths: 2^N, where N is the number of
   conditional statements
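 A worked check of the branch-coverage line on the read/write example used in these slides (node and edge counts are mine, taking a single entry and exit, P = 1):

    % Nodes: read; if x>0; z:=1; z:=0; if y<0; write(z); write(y/z); exit   =>  #V = 8
    % Edges: 9 (two out-edges from each decision, plus the straight-line and join edges)
    CC(G) = \#E - \#V + 2P = 9 - 8 + 2 = 3
    % i.e. the number of decisions plus one, so three tests suffice for branch coverage.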

                Mutation Testing
 A program under test is seeded with a single
   fault to produce a "mutant" program.
 A test covers (kills) a mutant if the output of the
   mutant and the program under test differ for
   that test input.
 The mutation coverage measure for a test set is the
   ratio of mutants covered to total mutants.
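 A minimal sketch (my own example) of one mutant and a test that covers it: the loop bound in sumTo is mutated from <= to <, and an input on which the two programs' outputs differ kills the mutant.

    #include <cassert>

    // Program under test.
    int sumTo(int n) {
        int s = 0;
        for (int i = 1; i <= n; ++i) s += i;
        return s;
    }

    // Mutant: a single seeded fault ("<=" replaced by "<").
    int sumTo_mutant(int n) {
        int s = 0;
        for (int i = 1; i < n; ++i) s += i;
        return s;
    }

    int main() {
        assert(sumTo(0) == sumTo_mutant(0));            // n = 0 does not distinguish them
        assert(sumTo(3) == 6 && sumTo_mutant(3) == 3);  // n = 3 does, so it covers this mutant
        return 0;
    }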


                       Debugging
                Program Slicing Approach
   Static slicing: decomposes a program by statically
    analyzing the data flow and control flow of the program.
     – A static program slice for a given variable at a given
       statement contains all the executable statements that
       could influence the value of that variable at that
       statement.
     – The exact execution path for a given input is a subset
       of the static program slice with respect to the output
       variables at the given checkpoint.
     – Focus is an automatic debugging tool based on static
       program slicing to locate bugs.
                Dynamic Slicing
 Dynamic data slice: a dynamic data slice with respect
   to a given expression, location, and test case is the
   set of all assignments whose computations have
   propagated into the current value of the given
   expression at the given location.
 Dynamic control slice: a dynamic control slice with
   respect to a given location and test case is the set
   of all predicates that enclose the given location.
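 A small illustration of the difference (my own program): the static slice on z at the output statement contains every assignment that could influence z, while the dynamic slices keep only what actually executed for a given test.

    #include <iostream>

    int main() {
        int x;
        std::cin >> x;        // (0)
        int z = 0;            // (1)
        if (x > 0)            // (2)
            z = x * 2;        // (3)
        std::cout << z;       // (4) slicing criterion: variable z at this statement
        return 0;
    }
    // Static slice on z at (4): statements (0), (1), (2), (3).
    // Dynamic data slice for the test x = 1:  (0) and (3); the initialization (1) is overwritten.
    // Dynamic data slice for the test x = -1: (1) only; statement (3) never executes.
    // Dynamic control slice for x = 1 at (3): the enclosing predicate (2).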
 Object-Oriented Testing Process
 Unit testing - class
 Integration testing - cluster
 System testing - program




                Class Testing
 Inheritance
 Polymorphism
 Sequence




                Class Testing Strategies
 Testing inheritance
 Testing polymorphism
 State-oriented testing
 Data Flow testing
 Function dependence class testing




                Inheritance
 A subclass may redefine its inherited functions, and
   other functions may be affected by the redefined
   functions.
 When this subclass is tested, which functions need
   to be re-tested?

 class foo {
     int local_var;
     ...
     virtual int f1() { return 1; }
     int f2() { return 1 / f1(); }
 };

 class foo_child : public foo {    // child class of foo
     int f1() { return 0; }        // redefines f1: the inherited f2 now divides by zero
 };


                Testing Inheritance
 "Incremental testing of object-oriented class
   structures," Harrold et al. (1992)
 New methods: complete testing
 Recursive methods: limited testing
 Redefined methods: reuse test scripts




                Polymorphism
 An object may be bound to different classes at run
   time.
 Is it necessary to test all the possible bindings?

 // beginning of function foo
 {
     ...
     P1 p;
     P2 c;
     ...
     return (c.f1() / p.f1());
 }
 // end of function foo


                 Testing Polymorphism
 "Testing the Polymorphic Interactions between
   Classes," McDaniel and McGregor (1994),
   Clemson University




                State-Oriented Testing
 "The state-based testing of object-oriented
   programs," C. D. Turner and D. J. Robson, 1992
 "On Object State Testing," Kung et al., 1993
 "The testgraph methodology: Automated testing of
   collection classes," Hoffman and Strooper, 1995
 The FREE approach: Binder,
   http://www.rbsc.com/pages/Free.html



                Data Flow Testing
 "Performing Data Flow Testing on Classes," 1994,
   Harrold and Rothermel.
 "Object-oriented data flow testing," 1995,
   Kung et al.




    Function Dependence Relationship
   A function uses a variable if the value of the variable
    is referenced in a computation expression or used to
    decide a predicate.
   A function defines a variable if the value of the
    variable is assigned when the function is invoked.
   A variable x uses a variable y if the value of x is
    obtained from the value of y (and possibly others);
    x is affected when the value of y is changed.



  Function Dependence Relationship
 f1 depends on f2 if any of the following holds
   (see the sketch below):
     – f1 uses a variable x that is defined in f2,
     – f1 calls f2 and uses the return value of f2,
     – f1 is called by f2 and uses a parameter p that is
       defined in f2,
     – f1 uses a variable x, and x uses a variable y
       which is defined in f2.
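 A minimal sketch (the Account class and its members are mine) of the first two cases: available() uses a member that deposit() defines, and it also uses the return value of fee().

    // Hypothetical class used only to illustrate the dependence relation.
    class Account {
        int balance = 0;
    public:
        void deposit(int amount) { balance += amount; }  // deposit defines balance
        int  fee() const { return 3; }
        int  available() const {
            return balance - fee();   // available uses balance (defined by deposit)
        }                             // and uses the return value of fee()
    };
    // So available depends on both deposit and fee: if either changes, re-test available().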


                Object-Oriented Pitfalls
 Type I faults
     – inheritance/polymorphism faults
 Type II faults
     – object management faults
 Type III faults
     – traditional faults




     Object-Oriented Pitfalls - Type I

class foo {
    int local_var;
    ...
public:
    virtual int f1() { return 1; }
    int f2() { return 1 / f1(); }
};

class foo_derived : public foo {
public:
    int f1() { return 0; }              // redefines f1
};

main() {
    int x, y;
    foo *Obj;
    cin >> x >> y;
    if (x > 0)
        Obj = new foo();
    else
        Obj = new foo_derived();
    if (y > 0)
        cout << Obj->f2();              // divides by zero when Obj is a foo_derived
    else
        cout << Obj->f1();
}

    Object-Oriented Pitfalls - Type II

class foo {
    int *m_data;
public:
    foo()  { m_data = new int; *m_data = 0; }
    ~foo() { delete m_data; }
    void print() { cout << *m_data; }
    void inc()   { (*m_data)++; }
};

main() {
L1: foo *obj1 = new foo();
L2: obj1->inc();
L3: foo *obj2 = new foo();
L4: *obj2 = *obj1;           // default assignment: obj1 and obj2 now share m_data
    if (P1)
L5:     obj2->inc();
L6: obj2->print();
    if (P2)
L7:     obj2->~foo();        // destroys the shared m_data
    if (P3)
L8:     obj1->print();       // dangling pointer if L7 executed
L9: obj1->~foo();            // double delete if L7 executed
}


      Empirical Study - Applications


   System A: GUI
   System B: Data logging system
   System C: Network communication program



                Fault Summary
 System         System A System B System C
 LOC              5.6k    21.3k    16.0k
 NOF               35       80       85
 Type I             5       15       10
 Type II            6       13        7
 Type III          24       52       68
 OOF (%)          31%      35%      20%

 (LOC = lines of code; NOF = number of faults; OOF = object-oriented
  faults, i.e. Type I + Type II, as a percentage of all faults)


                Functional Testing
 System          System A System B System C
 NOT                100      383     326
 Type I             1(4)    4(11)    5(5)
 Type II            1(5)     6(7)    3(4)
 Type III          15(9)   32(20)   44(24)
 OOF (%)          2(18%) 10(35%) 8(47%)
 NOF             17(48%) 42(52%) 52(61%)

 (NOT = number of test cases)


                Statement Testing
 System          System A System B System C
 NOT                 46       55       52
 Type I             0(4)    1(10)     0(5)
 Type II            1(4)     1(6)     1(3)
 Type III           5(4)    6(14)    4(20)
 OOF (%)          1(11%)   2(11%)   1(11%)
 NOF              6(33%)   8(21%)   5(15%)


                Branch Testing

System          System A System B System C
NOT                 23       41       33
Type I             0(4)     1(9)     1(4)
Type II            1(3)     1(5)     0(3)
Type III           1(3)     6(8)    5(15)
OOF (%)          1(11%)   2(11%)   1(11%)
NOF              2(11%)   8(21%)   6(18%)


                Code-Based Testing

  System    System A System B System C
  NOT          69       96       85
  OO faults 2(22%)    2(22%)   2(22%)
  NOF        8(44%) 12(42%) 13(33%)




                  All-States
 System         System A System B System C
 NOT                21      37       38
 Type I            1(3)    3(7)     2(3)
 Type II           0(5)    2(4)     1(2)
 Type III          4(5)    8(6)    10(10)
 OOF (%)         1(11%)   5(28%)   3(33%)
 NOF             5(28%) 13(34%) 13(40%)


                All-Transitions
 System         System A System B System C
 NOT                43      79       75
 Type I            1(2)    2(5)     0(3)
 Type II           2(3)    1(3)     1(1)
 Type III          4(1)    6(0)     8(2)
 OOF (%)         3(33%)   3(17%)   1(11%)
 NOF             7(39%)   9(24%)   9(27%)


                State-Based Testing
 System           System A System B System C
 NOT                 64       116      113
 OOF (%)           4(44%)   8(44%)   4(44%)
 NOF              12(66%) 22(58%) 22(67%)




         Object-Flow Based Testing
 Object
     – an object is an instance of a class
 Define
     – an object is defined if its state is initialized or
       changed
 Use
     – an object is used if one of its data members is
       referenced

    Object-Flow Coverage Criteria
 All-du-pairs
     – at least one definition-clear path from every
       definition of every object to every use of that
       definition must be exercised under some test.
 All-bindings
     – every possible binding of every object must be
       exercised at least once when the object is
       defined or used.


         Weak Object-Flow Testing
 An object is defined when
     – the constructor of the object is invoked;
     – a data member is defined; or
     – a method that initializes or modifies the data
       member(s) of the object is invoked.




     Object-Oriented Pitfalls - Type I

class foo {
    int local_var;
    ...
public:
    virtual int f1() { return 1; }
    int f2() { return 1 / f1(); }
};

class foo_derived : public foo {
public:
    int f1() { return 0; }              // redefines f1
};

main() {
    int x, y;
    foo *Obj;
    cin >> x >> y;
    if (x > 0)
        Obj = new foo();
    else
        Obj = new foo_derived();
    if (y > 0)
        cout << Obj->f2();              // divides by zero when Obj is a foo_derived
    else
        cout << Obj->f1();
}

    Object-Oriented Pitfalls - Type II

class foo {
    int *m_data;
public:
    foo()  { m_data = new int; *m_data = 0; }
    ~foo() { delete m_data; }
    void print() { cout << *m_data; }
    void inc()   { (*m_data)++; }
};

main() {
L1: foo *obj1 = new foo();
L2: obj1->inc();
L3: foo *obj2 = new foo();
L4: *obj2 = *obj1;           // default assignment: obj1 and obj2 now share m_data
    if (P1)
L5:     obj2->inc();
    if (P2)
L6:     obj2->~foo();        // destroys the shared m_data
    if (P3)
L7:     obj1->print();       // dangling pointer if L6 executed
    obj1->~foo();            // double delete if L6 executed
}



                All-DU-Pairs
 System         System A System B System C
 NOT                54      89       65
 Type I            2(2)    6(5)     1(4)
 Type II           4(1)    6(1)     4(0)
 Type III          3(6)    7(13)    3(21)
 OOF (%)         6(67%) 12(67%) 5(56%)
 NOF             9(50%) 19(50%) 8(24%)


                All-Bindings
 System         System A System B System C
 NOT                21      37       32
 Type I            1(1)    3(2)     3(1)
 Type II           1(0)    1(0)     0(0)
 Type III          2(4)    3(10)    4(17)
 OOF (%)         2(22%)   4(22%)   3(33%)
 NOF             4(22%)   7(18%)   7(21%)


       Object-Flow Based Testing I
 System         System A System B System C
 NOT                75      126      97
 OOF               8(1)    16(2)    8(1)
 NOF              13(5)   26(12)   15(18)




      Object-Flow Based Testing I’
 System         System A System B System C
 NOT               115      155     145
 OOF               8(1)    16(2)    8(1)
 NOF              16(2)    31(7)   21(12)




      Object-Flow Based Testing II
 System         System A System B System C
 NOT               187      295      247
 OOF (%)           9(0)    18(0)    9(0)
 NOF              17(1)    37(1)    28(5)




    Object-Flow Based Testing II’
 System         System A System B System C
 NOT               264      391      332
 OOF               9(0)    18(0)     9(0)
 NOF              17(1)    37(1)    29(4)




                Integrated Testing
 Functional testing
 Code-based testing
 Object-flow based testing I
 State-based testing




                Integrated Approach
 System           System A System B System C
 NOT                 126      131      167
 OOF                 9(0)    18(0)     9(0)
 NOF                17(1)    36(2)    29(4)





				