Automated Generation of Test Suites from
Formal Specifications

Alexander K. Petrenko
Institute for System Programming of the Russian
Academy of Sciences (ISP RAS), Moscow

             Cambridge, February 2000
Ideal Testing Process
     Why formal specification?

Forward engineering:
     design specifications + sources --> oracles, criteria --> partition --> tests

Reverse engineering:
     sources --> post-specifications --> oracles, criteria --> partition --> tests

       What kind of specifications?

• Pre- and post-conditions,      for           Oracles and Partition
  invariants

• Algebraic specifications       for           Test sequences

 2
                                   Cambridge, February, 2000
KVEST project history


• Started in 1994 under contract with Nortel Networks
     to develop a system that automatically generates test
     suites for regression testing from formal
     specifications reverse engineered from the existing
     code

• A joint effort of Nortel Networks and ISP RAS. ISP RAS
     background:
     — Soviet Space Mission Control Center OS and networks;
     — Soviet space shuttle "Buran": OS and real-time programming
       language;
     — formal specification of that real-time programming language

What is KVEST?


• KVEST: Kernel Verification and Specification Technology
• Area of Application:          specification, test generation, and test
                                execution for APIs such as an OS kernel interface
• Specification Language:       RAISE/RSL (VDM family)
• Specification Style:          state-oriented, implicit (pre- and
                                post-conditions, subtype restrictions)
• Target Language:              a programming language such as C/C++
• Size of Application:          over 600 Klines
• Size of Specification:        over 100 Klines
• Size of Test Suites:          over 2 Mlines
• Results:                      over a hundred errors detected
                                in several projects

Position

• Constraint specification

• Semi-automated test production

• Fully automated test execution and test result analysis

• Oriented toward use in industrial software development
  processes




Research and design problems

• Test system architecture
• Mapping between specification and programming languages
• Integration of generated and manual components -
    re-use of manual components

• Test sequence and test case generation




Verification processes


 • Reverse engineering: (post-) specification, testing based on
   the specification
 • Forward engineering: specification design, development, test
   production
 • Co-verification: specification design, simultaneous
   development and test production




Reverse engineering: Technology stream

Phase 1: Interface definition
     Documentation + source code --> software contract contents
     (Interface A1 ..., Interface A2 ...)

Phase 2: Specification
     --> actual documentation

Phase 3: Test suite production
     --> test drivers and test cases

Phase 4: Test execution analysis
     Test plans --> detected errors & test coverage reports
Key features of KVEST test suites

• Phase 1: Minimal and orthogonal API (Application Programming
  Interface) is determined

• Phase 2: Formal specification in RAISE Specification Language is
  developed for API.

• Phase 3: Automatic generation of sets of test suites (test cases and
  test sequences) in target language.

• Phase 4: Automatic execution of generated test suites. A pass/fail
  verdict is assigned to every test case execution, and an error summary is
  provided at the end of the run. The user can specify the required
  completeness of test coverage and the form of tracing.


An example of specification in RAISE
 DAY_OF_WEEK : INT >< INT -~-> RC >< WEEKDAY
 DAY_OF_WEEK( tday, tyear ) as ( post_rc, post_Answer )
 post
       if     tyear <= 0 \/ tday <= 0 \/
              tday > 366 \/ tday = 366 /\ ~a_IS_LEAP( tyear )
       then
              BRANCH( bad_param, "Bad parameters" );
              post_Answer = 0 /\ post_rc = NOK
       else
              BRANCH( ok, "OK" );
              post_Answer = ( a_DAYS_AFTER_INITIAL_YEAR( tyear, tday ) +
                              a_INITIAL_DAY_OF_WEEK ) \ a_DAYS_IN_WEEK
              /\ post_rc = OK
       end
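A post-condition like this is exactly what a basic driver's oracle evaluates. A minimal Python sketch of the idea follows; the helper names mirror the RSL specification, but the calendar constants and the leap-year rule are assumptions for illustration, not taken from the slide:

```python
# Hypothetical Python rendering of the DAY_OF_WEEK post-condition as a test
# oracle. Helper names mirror the RSL spec; the constants are assumptions.

DAYS_IN_WEEK = 7
INITIAL_DAY_OF_WEEK = 0      # assumed day of week of day 1, year 1
OK, NOK = "OK", "NOK"

def is_leap(year):
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

def days_after_initial_year(year, day):
    # Days elapsed from the assumed initial date to (day, year).
    total = 0
    for y in range(1, year):
        total += 366 if is_leap(y) else 365
    return total + day - 1

def day_of_week_oracle(tday, tyear, post_rc, post_answer):
    """Return the branch hit and whether the implementation's output
    satisfies the post-condition -- the basic driver's pass/fail verdict."""
    if tyear <= 0 or tday <= 0 or tday > 366 or (tday == 366 and not is_leap(tyear)):
        return "bad_param", post_answer == 0 and post_rc == NOK
    expected = (days_after_initial_year(tyear, tday) + INITIAL_DAY_OF_WEEK) % DAYS_IN_WEEK
    return "ok", post_answer == expected and post_rc == OK
```

Note that the oracle never computes the day of week "the same way" as the implementation must; it only checks the declared relation between inputs and outputs.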

  Partition based on the specification

Specification:
post
   if     a \/ b \/ c \/ d /\ e
   then   BRANCH( bad_param, "Bad parameters" )
   else   BRANCH( ok, "OK" )
   end

Partition (branches and Full Disjunctive Normal Forms - FDNF):
   BRANCH "Bad parameters"
   • a /\ b /\ c /\ d /\ e
   • ~a /\ b /\ c /\ d /\ e
   • ...
   BRANCH "OK"
   • ~a /\ ~b /\ ~c /\ ~d /\ e
   • ...
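The FDNF partition can be computed mechanically: enumerate every truth assignment over the guard's atoms and group it by the branch it selects. A small Python sketch, where the guard mirrors the example above and the atom names a-e are the slide's placeholders:

```python
from itertools import product

# Sketch of partitioning a branch guard into Full Disjunctive Normal Form
# (FDNF): every truth assignment over the guard's atoms is one disjunct,
# grouped by the branch it drives.

def fdnf_partition(atoms, guard):
    """Map each branch ('bad_param' / 'ok' here) to its list of full disjuncts."""
    partition = {"bad_param": [], "ok": []}
    for values in product([True, False], repeat=len(atoms)):
        env = dict(zip(atoms, values))
        branch = "bad_param" if guard(env) else "ok"
        disjunct = " /\\ ".join(("" if env[x] else "~") + x for x in atoms)
        partition[branch].append(disjunct)
    return partition

# Guard of the example: a \/ b \/ c \/ (d /\ e)
part = fdnf_partition(
    ["a", "b", "c", "d", "e"],
    lambda v: v["a"] or v["b"] or v["c"] or (v["d"] and v["e"]),
)
```

With five atoms there are 32 disjuncts in total; only the three with a, b, c all false and d /\ e false fall into the "OK" branch.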




Test execution scheme

     Specifications --> test suite generators

UNIX test harness:
     test drivers <-- test case parameters
     program behavior model  vs.  SUT (on the target platform)
     comparison --> verdict and trace
Test execution management

Unix workstation:
     Navigator:
     - test suite generation
     - repository browser
     - test plan run
     Repository

Target platform:
     Test suite: script driver, MDC, basic drivers
     Test bed:
     - process control
     - communication
     - basic data conversion

     MDC - Manually Developed Components

KVEST Test Drivers

• Hierarchy of Test Drivers
   — Basic test drivers: test single procedure by receiving input, calling the
       procedure, recording the output, assigning a verdict

   — Script drivers: generate sets of input parameters, call basic drivers,
       evaluate results of test sequences, monitor test coverage

   — Test plans: define the order of script driver calls with given test options
       and check their execution

• KVEST uses a set of script driver skeletons to generate script
  drivers

• Test drivers are compiled from RAISE into the target language
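The two lower levels of this hierarchy can be sketched in a few lines of Python; `my_abs` and `abs_oracle` below are toy stand-ins for illustration, not KVEST components:

```python
# Sketch of the driver hierarchy: a basic driver tests a single procedure
# against its oracle; a script driver iterates parameter tuples, calls the
# basic driver, and accumulates coverage and failures.

def basic_driver(procedure, oracle, args):
    """Call the procedure under test once; the oracle assigns the verdict."""
    result = procedure(*args)
    branch, passed = oracle(args, result)
    return branch, passed

def script_driver(procedure, oracle, tuple_iterator):
    """Drive a test sequence and monitor branch coverage."""
    covered, failures = set(), []
    for args in tuple_iterator:
        branch, passed = basic_driver(procedure, oracle, args)
        covered.add(branch)
        if not passed:
            failures.append(args)
    return covered, failures

# Toy target procedure and a deliberately simple oracle for it.
def my_abs(x):
    return -x if x < 0 else x

def abs_oracle(args, result):
    (x,) = args
    branch = "neg" if x < 0 else "nonneg"
    return branch, result == abs(x)

coverage, failed = script_driver(my_abs, abs_oracle, [(-2,), (0,), (5,)])
```

The separation mirrors the slide: the basic driver knows only one procedure and its specification, while the script driver owns sequencing and coverage bookkeeping.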


Test generation scheme

Tools (UNIX):
     RAISE specifications --> basic driver generator + test case generator
     Script driver skeletons --> script driver generator
     RAISE -> target language compiler

Target platform:
     Test suites = basic drivers + test case parameters + script drivers
Test generation scheme, details

Tools (UNIX):
     RAISE specifications --> basic driver generator --> test case generator
     Test case generator + filters --> script driver generator
     Script driver skeletons and Manually Developed Components
     (data converters, iterators, state observers) --> script driver generator
     RAISE -> target language compiler

Target platform:
     Test suites = basic drivers + test case parameters + script drivers
Test sequence generation based on
implicit Finite State Machine (FSM)
       – Partition based on pre- and post-conditions
       – Implicit FSM definition

       (Example FSM: states S1-S4 with transitions
        labeled by operations op1, op2, op3)
 Test sequence generation based on
 implicit FSM

       (The same example FSM: states S1-S4,
        transitions labeled op1, op2, op3)

       Partition (branches and Full Disjunctive Normal Forms - FDNF):
       BRANCH "Bad parameters"
       • a /\ b /\ c /\ d /\ e         -- op1
       • ~a /\ b /\ c /\ d /\ e        -- op2
       • ...
       BRANCH "OK"
       • ~a /\ ~b /\ ~c /\ ~d /\ e     -- opi
       • ...
Conclusion on KVEST experience

• Code inspection during formal specification can detect up to 1/3 of
  the errors
• Code inspection cannot replace testing: up to 2/3 of the errors are
  detected during and after testing.
• Testing is necessary to develop correct specifications.
• Up to 1/3 of the errors were caused by lack of knowledge of pre-
  conditions and of some details of the called procedures' behavior.




What part of testware is generated automatically?


    Kind of source for          Percentage of   Ratio of source size     Kind of
    test generation             the sources     to generated test size   generation result

    Specification                    50               1:5                Basic drivers

    Data converters,                 50               1:10               Script drivers
    iterators and
    state observers
    (MDC)




Solved and unsolved problems in test automation

Phase 1 - Interface definition:
   automated or simple:          for well-designed software
   not automated, not simple:    for legacy software

Phase 2 - Specification:
   automated or simple:          for single operations

Phase 3 - Test suite production:
   automated or simple:          test oracles, partition, filters
   not automated, not simple:    test sequence design for operation groups

Phase 4 - Test execution analysis:
   automated or simple:          test plans, execution and analysis,
                                 browsing, reporting
   not automated, not simple:    test result understanding
Specification based testing: problems and prospects

 Problems:
 • Lack of correspondence between specification
   and programming languages
 • Users' resistance to studying a specification
   language and an additional SDE
 • Methodology of test sequence generation
 • Testing methodologies for specific software areas

 Prospects:
 • Use an OO programming language specification
   extension and a standard SDE instead of a specific SDE
 • FSM extraction from implicit specifications,
   FSM factorization
 • Research on distributed software specification
   and testing
Part II. KVEST revision
Specification notation revision.
UniTesK: Universal TEsting and Specification toolKit
 • Formal methods deployment problems
    — lack of users with theoretical background
    — lack of tools
    — non-conventional languages and paradigms
 • UniTesK Solutions
    — first step is possible without “any theory”
    — extension of C++ and Java
    — integration with standard software development environment
 • Related works
    — ADL/ADL2
    — Eiffel, Larch, iContract


UniTesK: Test generation scheme

Tools:
     Specifications in Java or C++ extension --> test oracles generator
     Path builder engines (iterators, FSM, use cases)
     Test oracles + path builder engines --> OO test suite generator

Target platform:
     Test suites in the target language:
     test oracles + test sequence fabric + iterators, FSM
Integration of Constraint Verification tools
into software development environment

     A standard Software Development Environment:
     a UML-based design environment extended with
     specification and verification tools
     for the standard notation
Part III. Test generation inside
Requirements. Test coverage criteria

          – All branches

          – All disjuncts (all accessible disjuncts)

Specification:
post
   if     a \/ b \/ c \/ d /\ e
   then   BRANCH( bad_param, "Bad parameters" )
   else   BRANCH( ok, "OK" )
   end

Partition (branches and Full Disjunctive Normal Forms - FDNF):
   BRANCH "Bad parameters"
   • a /\ b /\ c /\ d /\ e
   • ~a /\ b /\ c /\ d /\ e
   • ...
   BRANCH "OK"
   • ~a /\ ~b /\ ~c /\ ~d /\ e
   • ...




Test sequence kinds. Kinds 1st, 2nd, 3rd

   Kinds 1-3: procedures that can be tested separately, because no other
     target procedure is needed to generate input parameters or to
     analyze the outcome.

   — Kind 1. The input is data that can be represented in literal (text)
     form and produced without accounting for any interdependencies
     between the values of different parameters.

   — Kind 2. No interdependencies exist between the input items
     (values of input parameters), but the input does not have to be in
     literal form.

   — Kind 3. Some interdependencies exist; however, separate testing
     is still possible.
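For kind 2 the absence of interdependencies means a parameter tuple iterator is simply the Cartesian product of per-type data iterators, which is why only the data type iterators remain manual (see the next slide). A Python sketch, where the value lists are illustrative assumptions:

```python
from itertools import product

# For kind-2 procedures the input items are independent, so a parameter
# tuple iterator is just the Cartesian product of per-type data iterators.
# The data type iterators themselves are the manually developed part.

def tuple_iterator(*type_iterators):
    """Yield every combination of the per-parameter test values."""
    yield from product(*type_iterators)

int_values = [-1, 0, 1]          # assumed hand-written data type iterator
str_values = ["", "x"]           # another assumed data type iterator

tuples = list(tuple_iterator(int_values, str_values))
```

Here 3 integer values and 2 string values yield 6 parameter tuples; for kind 3 this product would have to be filtered or replaced by a hand-written tuple iterator that respects the interdependencies.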

Kinds 1st, 2nd, 3rd. What is automated?

        Kind    Automatically              Manually

        1st     Everything                 Nothing

        2nd     Test sequences and         Data type
                parameter tuple            iterators
                iterators

        3rd     Test sequences             Parameter
                                           tuple iterators


Test sequence kinds. Kinds 4th and 5th



       Kinds 4th and 5th. The operations cannot be tested
         separately, because some input can be produced
         only by calling another operation from the group
         and/or some outcome can be analyzed only by
         calling other procedures.




Requirements for kinds 4th and 5th



      The same requirements: all branches/all disjuncts

      Additional problem: how to traverse all states?




FSM use for API testing

       Traditional FSM approach (explicit FSM definition):
       — define all states
       — for each state define all transitions (operation, input
         parameters, outcome, next state)

       ISPRAS approach (implicit FSM definition):
       — the state is defined by a type definition
       — for each state:
          - applicable operations and their inputs are defined by pre-conditions
          - the outcome and the next state are defined by post-conditions
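The implicit style can be sketched as a traversal that discovers states on the fly: an operation fires wherever its pre-condition holds, and a model of its post-condition produces the next state. The bounded-counter model below is an illustrative assumption, not part of KVEST:

```python
from collections import deque

# Sketch of implicit-FSM exploration: states are never listed up front.
# An operation is applicable in a state when its pre-condition holds, and
# a model of its post-condition yields the next state.

def explore(initial, operations):
    """Breadth-first traversal collecting every reachable (state, op, next) transition."""
    seen, transitions = {initial}, []
    queue = deque([initial])
    while queue:
        state = queue.popleft()
        for name, pre, apply_op in operations:
            if pre(state):                      # transition enabled by pre-condition
                nxt = apply_op(state)           # next state from post-condition model
                transitions.append((state, name, nxt))
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append(nxt)
    return seen, transitions

# Toy model: a counter bounded to 0..2, with inc and dec operations.
ops = [
    ("inc", lambda s: s < 2, lambda s: s + 1),
    ("dec", lambda s: s > 0, lambda s: s - 1),
]
states, trans = explore(0, ops)
```

Replaying the path by which each transition was first reached gives a test sequence that exercises every reachable transition without ever enumerating the states by hand.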



Advanced FSM use


  — FSM factorization
  — Optimization of exhaustive FSM traversal
  — Use-case based test sequence generation
  — Test scenario modularization
  — Friendly interface for test sequence generation
    and debugging



References

       – Igor Bourdonov, Alexander Kossatchev, Alexander
         Petrenko, and Dmitri Galter. KVEST: Automated Generation
         of Test Suites from Formal Specifications. In: Proceedings of the
         World Congress on Formal Methods, Toulouse, France,
         LNCS 1708, 1999, pp. 608-621.

       – Igor Burdonov, Alexander Kosachev, Victor Kuliamin. FSM
         Using for Software Testing. Programming and Computer
         Software, Moscow - New York, No. 2, 2000.




Contacts

Alexander Petrenko
   Institute for System Programming of Russian Academy of Sciences
   (ISP RAS),
   Moscow, Russia
   petrenko@ispras.ru
   phone:         +7 (095) 912-5317 ext 4404
   fax:           +7 (095) 912-1524
  http://www.ispras.ru/~RedVerst/index.html





				