
Certified Tester
Foundation Level Syllabus

Released
Version 2011

International Software Testing Qualifications Board


Copyright Notice
This document may be copied in its entirety, or extracts made, if the source is acknowledged.

Copyright Notice © International Software Testing Qualifications Board (hereinafter called ISTQB®)
ISTQB is a registered trademark of the International Software Testing Qualifications Board.

Copyright © 2011 the authors for the update 2011 (Thomas Müller (chair), Debra Friedenberg, and the ISTQB WG Foundation Level)

Copyright © 2010 the authors for the update 2010 (Thomas Müller (chair), Armin Beer, Martin Klonk, Rahul Verma)

Copyright © 2007 the authors for the update 2007 (Thomas Müller (chair), Dorothy Graham, Debra Friedenberg and Erik van Veenendaal)

Copyright © 2005, the authors (Thomas Müller (chair), Rex Black, Sigrid Eldh, Dorothy Graham, Klaus Olsen, Maaret Pyhäjärvi, Geoff Thompson and Erik van Veenendaal).

All rights reserved.

The authors hereby transfer the copyright to the International Software Testing Qualifications Board (ISTQB). The authors (as current copyright holders) and ISTQB (as the future copyright holder) have agreed to the following conditions of use:

1) Any individual or training company may use this syllabus as the basis for a training course if the authors and the ISTQB are acknowledged as the source and copyright owners of the syllabus and provided that any advertisement of such a training course may mention the syllabus only after submission for official accreditation of the training materials to an ISTQB recognized National Board.
2) Any individual or group of individuals may use this syllabus as the basis for articles, books, or other derivative writings if the authors and the ISTQB are acknowledged as the source and copyright owners of the syllabus.
3) Any ISTQB-recognized National Board may translate this syllabus and license the syllabus (or its translation) to other parties.







Revision History

Version      Date                    Remarks
ISTQB 2011   Effective 1-Apr-2011    Certified Tester Foundation Level Syllabus Maintenance Release
                                     – see Appendix E – Release Notes
ISTQB 2010   Effective 30-Mar-2010   Certified Tester Foundation Level Syllabus Maintenance Release
                                     – see Appendix E – Release Notes
ISTQB 2007   01-May-2007             Certified Tester Foundation Level Syllabus Maintenance Release
ISTQB 2005   01-July-2005            Certified Tester Foundation Level Syllabus
ASQF V2.2    July-2003               ASQF Syllabus Foundation Level Version 2.2
                                     “Lehrplan Grundlagen des Software-testens“
ISEB V2.0    25-Feb-1999             ISEB Software Testing Foundation Syllabus V2.0, 25 February 1999







Table of Contents

Acknowledgements
Introduction to this Syllabus
   Purpose of this Document
   The Certified Tester Foundation Level in Software Testing
   Learning Objectives/Cognitive Level of Knowledge
   The Examination
   Accreditation
   Level of Detail
   How this Syllabus is Organized
1. Fundamentals of Testing (K2)
   1.1 Why is Testing Necessary (K2)
      1.1.1 Software Systems Context (K1)
      1.1.2 Causes of Software Defects (K2)
      1.1.3 Role of Testing in Software Development, Maintenance and Operations (K2)
      1.1.4 Testing and Quality (K2)
      1.1.5 How Much Testing is Enough? (K2)
   1.2 What is Testing? (K2)
   1.3 Seven Testing Principles (K2)
   1.4 Fundamental Test Process (K1)
      1.4.1 Test Planning and Control (K1)
      1.4.2 Test Analysis and Design (K1)
      1.4.3 Test Implementation and Execution (K1)
      1.4.4 Evaluating Exit Criteria and Reporting (K1)
      1.4.5 Test Closure Activities (K1)
   1.5 The Psychology of Testing (K2)
   1.6 Code of Ethics
2. Testing Throughout the Software Life Cycle (K2)
   2.1 Software Development Models (K2)
      2.1.1 V-model (Sequential Development Model) (K2)
      2.1.2 Iterative-incremental Development Models (K2)
      2.1.3 Testing within a Life Cycle Model (K2)
   2.2 Test Levels (K2)
      2.2.1 Component Testing (K2)
      2.2.2 Integration Testing (K2)
      2.2.3 System Testing (K2)
      2.2.4 Acceptance Testing (K2)
   2.3 Test Types (K2)
      2.3.1 Testing of Function (Functional Testing) (K2)
      2.3.2 Testing of Non-functional Software Characteristics (Non-functional Testing) (K2)
      2.3.3 Testing of Software Structure/Architecture (Structural Testing) (K2)
      2.3.4 Testing Related to Changes: Re-testing and Regression Testing (K2)
   2.4 Maintenance Testing (K2)
3. Static Techniques (K2)
   3.1 Static Techniques and the Test Process (K2)
   3.2 Review Process (K2)
      3.2.1 Activities of a Formal Review (K1)
      3.2.2 Roles and Responsibilities (K1)
      3.2.3 Types of Reviews (K2)
      3.2.4 Success Factors for Reviews (K2)
   3.3 Static Analysis by Tools (K2)
4. Test Design Techniques (K4)
   4.1 The Test Development Process (K3)
   4.2 Categories of Test Design Techniques (K2)


   4.3 Specification-based or Black-box Techniques (K3)
      4.3.1 Equivalence Partitioning (K3)
      4.3.2 Boundary Value Analysis (K3)
      4.3.3 Decision Table Testing (K3)
      4.3.4 State Transition Testing (K3)
      4.3.5 Use Case Testing (K2)
   4.4 Structure-based or White-box Techniques (K4)
      4.4.1 Statement Testing and Coverage (K4)
      4.4.2 Decision Testing and Coverage (K4)
      4.4.3 Other Structure-based Techniques (K1)
   4.5 Experience-based Techniques (K2)
   4.6 Choosing Test Techniques (K2)
5. Test Management (K3)
   5.1 Test Organization (K2)
      5.1.1 Test Organization and Independence (K2)
      5.1.2 Tasks of the Test Leader and Tester (K1)
   5.2 Test Planning and Estimation (K3)
      5.2.1 Test Planning (K2)
      5.2.2 Test Planning Activities (K3)
      5.2.3 Entry Criteria (K2)
      5.2.4 Exit Criteria (K2)
      5.2.5 Test Estimation (K2)
      5.2.6 Test Strategy, Test Approach (K2)
   5.3 Test Progress Monitoring and Control (K2)
      5.3.1 Test Progress Monitoring (K1)
      5.3.2 Test Reporting (K2)
      5.3.3 Test Control (K2)
   5.4 Configuration Management (K2)
   5.5 Risk and Testing (K2)
      5.5.1 Project Risks (K2)
      5.5.2 Product Risks (K2)
   5.6 Incident Management (K3)
6. Tool Support for Testing (K2)
   6.1 Types of Test Tools (K2)
      6.1.1 Tool Support for Testing (K2)
      6.1.2 Test Tool Classification (K2)
      6.1.3 Tool Support for Management of Testing and Tests (K1)
      6.1.4 Tool Support for Static Testing (K1)
      6.1.5 Tool Support for Test Specification (K1)
      6.1.6 Tool Support for Test Execution and Logging (K1)
      6.1.7 Tool Support for Performance and Monitoring (K1)
      6.1.8 Tool Support for Specific Testing Needs (K1)
   6.2 Effective Use of Tools: Potential Benefits and Risks (K2)
      6.2.1 Potential Benefits and Risks of Tool Support for Testing (for all tools) (K2)
      6.2.2 Special Considerations for Some Types of Tools (K1)
   6.3 Introducing a Tool into an Organization (K1)
7. References
   Standards
   Books
8. Appendix A – Syllabus Background
   History of this Document
   Objectives of the Foundation Certificate Qualification
   Objectives of the International Qualification (adapted from ISTQB meeting at Sollentuna, November 2001)
   Entry Requirements for this Qualification


   Background and History of the Foundation Certificate in Software Testing
9. Appendix B – Learning Objectives/Cognitive Level of Knowledge
   Level 1: Remember (K1)
   Level 2: Understand (K2)
   Level 3: Apply (K3)
   Level 4: Analyze (K4)
10. Appendix C – Rules Applied to the ISTQB Foundation Syllabus
   10.1.1 General Rules
   10.1.2 Current Content
   10.1.3 Learning Objectives
   10.1.4 Overall Structure
11. Appendix D – Notice to Training Providers
12. Appendix E – Release Notes
   Release 2010
   Release 2011
13. Index
                                                               ........................................








Acknowledgements

International Software Testing Qualifications Board Working Group Foundation Level (Edition 2011): Thomas Müller (chair), Debra Friedenberg. The core team thanks the review team (Dan Almog, Armin Beer, Rex Black, Julie Gardiner, Judy McKay, Tuula Pääkkönen, Eric Riou du Cosquier, Hans Schaefer, Stephanie Ulrich, Erik van Veenendaal) and all National Boards for the suggestions for the current version of the syllabus.

International Software Testing Qualifications Board Working Group Foundation Level (Edition 2010): Thomas Müller (chair), Rahul Verma, Martin Klonk and Armin Beer. The core team thanks the review team (Rex Black, Mette Bruhn-Pederson, Debra Friedenberg, Klaus Olsen, Judy McKay, Tuula Pääkkönen, Meile Posthuma, Hans Schaefer, Stephanie Ulrich, Pete Williams, Erik van Veenendaal) and all National Boards for their suggestions.

International Software Testing Qualifications Board Working Group Foundation Level (Edition 2007): Thomas Müller (chair), Dorothy Graham, Debra Friedenberg, and Erik van Veenendaal. The core team thanks the review team (Hans Schaefer, Stephanie Ulrich, Meile Posthuma, Anders Pettersson, and Wonil Kwon) and all the National Boards for their suggestions.

International Software Testing Qualifications Board Working Group Foundation Level (Edition 2005): Thomas Müller (chair), Rex Black, Sigrid Eldh, Dorothy Graham, Klaus Olsen, Maaret Pyhäjärvi, Geoff Thompson and Erik van Veenendaal and the review team and all National Boards for their suggestions.







Introduction to this Syllabus

Purpose of this Document

This syllabus forms the basis for the International Software Testing Qualification at the Foundation Level. The International Software Testing Qualifications Board (ISTQB) provides it to the National Boards for them to accredit the training providers and to derive examination questions in their local language. Training providers will determine appropriate teaching methods and produce courseware for accreditation. The syllabus will help candidates in their preparation for the examination.

Information on the history and background of the syllabus can be found in Appendix A.

The Certified Tester Foundation Level in Software Testing

The Foundation Level qualification is aimed at anyone involved in software testing. This includes people in roles such as testers, test analysts, test engineers, test consultants, test managers, user acceptance testers and software developers. This Foundation Level qualification is also appropriate for anyone who wants a basic understanding of software testing, such as project managers, quality managers, software development managers, business analysts, IT directors and management consultants. Holders of the Foundation Certificate will be able to go on to a higher-level software testing qualification.

Learning Objectives/Cognitive Level of Knowledge

Learning objectives are indicated for each section in this syllabus and classified as follows:
o K1: remember
o K2: understand
o K3: apply
o K4: analyze

Further details and examples of learning objectives are given in Appendix B.

All terms listed under “Terms” just below chapter headings shall be remembered (K1), even if not explicitly mentioned in the learning objectives.

The Examination

The Foundation Level Certificate examination will be based on this syllabus. Answers to examination questions may require the use of material based on more than one section of this syllabus. All sections of the syllabus are examinable.

The format of the examination is multiple choice.

Exams may be taken as part of an accredited training course or taken independently (e.g., at an examination center or in a public exam). Completion of an accredited training course is not a pre-requisite for the exam.

Accreditation

An ISTQB National Board may accredit training providers whose course material follows this syllabus. Training providers should obtain accreditation guidelines from the board or body that performs the accreditation. An accredited course is recognized as conforming to this syllabus, and is allowed to have an ISTQB examination as part of the course.

Further guidance for training providers is given in Appendix D.







Level of Detail
The level of detail in this syllabus allows internationally consistent teaching and examination. In order to achieve this goal, the syllabus consists of:
o General instructional objectives describing the intention of the Foundation Level
o A list of information to teach, including a description, and references to additional sources if required
o Learning objectives for each knowledge area, describing the cognitive learning outcome and mindset to be achieved
o A list of terms that students must be able to recall and understand
o A description of the key concepts to teach, including sources such as accepted literature or standards

The syllabus content is not a description of the entire knowledge area of software testing; it reflects the level of detail to be covered in Foundation Level training courses.

How this Syllabus is Organized
There are six major chapters. The top-level heading for each chapter shows the highest level of learning objectives that is covered within the chapter and specifies the time for the chapter. For example:

2. Testing Throughout the Software Life Cycle (K2)                                           115 minutes

This heading shows that Chapter 2 has learning objectives of K1 (assumed when a higher level is shown) and K2 (but not K3), and it is intended to take 115 minutes to teach the material in the chapter. Within each chapter there are a number of sections. Each section also has the learning objectives and the amount of time required. Subsections that do not have a time given are included within the time for the section.








1. Fundamentals of Testing (K2)                                                              155 minutes

Learning Objectives for Fundamentals of Testing
The objectives identify what you will be able to do following the completion of each module.

1.1 Why is Testing Necessary? (K2)
LO-1.1.1  Describe, with examples, the way in which a defect in software can cause harm to a person, to the environment or to a company (K2)
LO-1.1.2  Distinguish between the root cause of a defect and its effects (K2)
LO-1.1.3  Give reasons why testing is necessary by giving examples (K2)
LO-1.1.4  Describe why testing is part of quality assurance and give examples of how testing contributes to higher quality (K2)
LO-1.1.5  Explain and compare the terms error, defect, fault, failure and the corresponding terms mistake and bug, using examples (K2)

1.2 What is Testing? (K2)
LO-1.2.1  Recall the common objectives of testing (K1)
LO-1.2.2  Provide examples for the objectives of testing in different phases of the software life cycle (K2)
LO-1.2.3  Differentiate testing from debugging (K2)

1.3 Seven Testing Principles (K2)
LO-1.3.1  Explain the seven principles in testing (K2)

1.4 Fundamental Test Process (K1)
LO-1.4.1  Recall the five fundamental test activities and respective tasks from planning to closure (K1)

1.5 The Psychology of Testing (K2)
LO-1.5.1  Recall the psychological factors that influence the success of testing (K1)
LO-1.5.2  Contrast the mindset of a tester and of a developer (K2)








1.1 Why is Testing Necessary (K2)                                                             20 minutes

Terms
Bug, defect, error, failure, fault, mistake, quality, risk

1.1.1 Software Systems Context (K1)
Software systems are an integral part of life, from business applications (e.g., banking) to consumer products (e.g., cars). Most people have had an experience with software that did not work as expected. Software that does not work correctly can lead to many problems, including loss of money, time or business reputation, and could even cause injury or death.

1.1.2 Causes of Software Defects (K2)
A human being can make an error (mistake), which produces a defect (fault, bug) in the program code, or in a document. If a defect in code is executed, the system may fail to do what it should do (or do something it shouldn't), causing a failure. Defects in software, systems or documents may result in failures, but not all defects do so.

Defects occur because human beings are fallible and because there is time pressure, complex code, complexity of infrastructure, changing technologies, and/or many system interactions.

Failures can be caused by environmental conditions as well. For example, radiation, magnetism, electronic fields, and pollution can cause faults in firmware or influence the execution of software by changing the hardware conditions.
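
To make the chain from error to defect to failure concrete, here is a small illustrative sketch (not part of the syllabus text; the function and values are invented for the example):

    # Illustrative sketch only: a programmer's mistake (error) introduces a
    # defect into the code; the defect causes a failure only when the
    # defective statement is actually executed.

    def average(values):
        # Defect: the divisor should be len(values); the "- 1" is the result
        # of a human error made while writing the code.
        return sum(values) / (len(values) - 1)

    print(average([2, 4, 6]))  # failure observed: prints 6.0 instead of the expected 4.0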

1.1.3 Role of Testing in Software Development, Maintenance and Operations (K2)
Rigorous testing of systems and documentation can help to reduce the risk of problems occurring during operation and contribute to the quality of the software system, if the defects found are corrected before the system is released for operational use.

Software testing may also be required to meet contractual or legal requirements, or industry-specific standards.

1.1.4 Testing and Quality (K2)
With the help of testing, it is possible to measure the quality of software in terms of defects found, for both functional and non-functional software requirements and characteristics (e.g., reliability, usability, efficiency, maintainability and portability). For more information on non-functional testing see Chapter 2; for more information on software characteristics see 'Software Engineering – Software Product Quality' (ISO 9126).

Testing can give confidence in the quality of the software if it finds few or no defects. A properly designed test that passes reduces the overall level of risk in a system. When testing does find defects, the quality of the software system increases when those defects are fixed.

Lessons should be learned from previous projects. By understanding the root causes of defects found in other projects, processes can be improved, which in turn should prevent those defects from reoccurring and, as a consequence, improve the quality of future systems. This is an aspect of quality assurance.

Testing should be integrated as one of the quality assurance activities (i.e., alongside development standards, training and defect analysis).


1.1.5 How Much Testing is Enough? (K2)
Deciding how much testing is enough should take account of the level of risk, including technical, safety, and business risks, and project constraints such as time and budget. Risk is discussed further in Chapter 5.

Testing should provide sufficient information to stakeholders to make informed decisions about the release of the software or system being tested, for the next development step or handover to customers.








1.2 What is Testing? (K2)                                                                     30 minutes

Terms
Debugging, requirement, review, test case, testing, test objective

Background
A common perception of testing is that it only consists of running tests, i.e., executing the software. This is part of testing, but not all of the testing activities.

Test activities exist before and after test execution. These activities include planning and control, choosing test conditions, designing and executing test cases, checking results, evaluating exit criteria, reporting on the testing process and system under test, and finalizing or completing closure activities after a test phase has been completed. Testing also includes reviewing documents (including source code) and conducting static analysis.

Both dynamic testing and static testing can be used as a means for achieving similar objectives, and will provide information that can be used to improve both the system being tested and the development and testing processes.

Testing can have the following objectives:
o Finding defects
o Gaining confidence about the level of quality
o Providing information for decision-making
o Preventing defects

The thought process and activities involved in designing tests early in the life cycle (verifying the test basis via test design) can help to prevent defects from being introduced into code. Reviews of documents (e.g., requirements) and the identification and resolution of issues also help to prevent defects appearing in the code.

Different viewpoints in testing take different objectives into account. For example, in development testing (e.g., component, integration and system testing), the main objective may be to cause as many failures as possible so that defects in the software are identified and can be fixed. In acceptance testing, the main objective may be to confirm that the system works as expected, to gain confidence that it has met the requirements. In some cases the main objective of testing may be to assess the quality of the software (with no intention of fixing defects), to give information to stakeholders of the risk of releasing the system at a given time. Maintenance testing often includes testing that no new defects have been introduced during development of the changes. During operational testing, the main objective may be to assess system characteristics such as reliability or availability.

Debugging and testing are different. Dynamic testing can show failures that are caused by defects. Debugging is the development activity that finds, analyzes and removes the cause of the failure. Subsequent re-testing by a tester ensures that the fix does indeed resolve the failure. The responsibility for these activities is usually split: testers test and developers debug.
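
As a small illustration (not from the syllabus; the function and checks below are invented), a test only shows that a failure occurs, debugging locates and removes its cause, and the test is then re-run as confirmation testing:

    # Illustrative sketch: the test exposes a failure; debugging (a development
    # activity) finds and removes the defect; re-running the test confirms the fix.

    def is_leap_year(year):
        # Fixed after debugging: the first version returned only year % 4 == 0
        # and therefore failed for years such as 1900.
        return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

    def test_is_leap_year():
        assert is_leap_year(2000) is True
        assert is_leap_year(1900) is False  # this check exposed the original failure
        assert is_leap_year(2024) is True

    test_is_leap_year()  # confirmation test: passes after the defect was removed
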

The proc            ng         esting activitie are explained in Section 1.4.
       cess of testin and the te              es








1.3 Seven Testing Principles (K2)                                                             35 minutes

Terms
Exhaustive testing

Principles
A number of testing principles have been suggested over the past 40 years and offer general guidelines common for all testing.

Principle 1 – Testing shows presence of defects
Testing can show that defects are present, but cannot prove that there are no defects. Testing reduces the probability of undiscovered defects remaining in the software but, even if no defects are found, it is not a proof of correctness.

Principle 2 – Exhaustive testing is impossible
Testing everything (all combinations of inputs and preconditions) is not feasible except for trivial cases. Instead of exhaustive testing, risk analysis and priorities should be used to focus testing efforts.
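
As a rough illustration (not part of the syllabus text; the field sizes are invented), even a single small input screen already has far too many input combinations to test them all:

    # Illustrative sketch: four independent inputs with modest value ranges
    # already yield tens of thousands of combinations, before preconditions
    # or sequences of user actions are even considered.
    import math

    values_per_field = [10, 26, 100, 2]  # e.g., one digit, one letter, an age field, a checkbox
    print(math.prod(values_per_field))   # 52000 combinations for one simple screen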

Principle 3 – Early testing
To find defects early, testing activities shall be started as early as possible in the software or system development life cycle, and shall be focused on defined objectives.

Principle 4 – Defect clustering
Testing effort shall be focused proportionally to the expected and later observed defect density of modules. A small number of modules usually contains most of the defects discovered during pre-release testing, or is responsible for most of the operational failures.

Principle 5 – Pesticide paradox
If the same tests are repeated over and over again, eventually the same set of test cases will no longer find any new defects. To overcome this "pesticide paradox", test cases need to be regularly reviewed and revised, and new and different tests need to be written to exercise different parts of the software or system to find potentially more defects.

Principle 6 – Testing is context dependent
Testing is done differently in different contexts. For example, safety-critical software is tested differently from an e-commerce site.

Principle 7 – Absence-of-errors fallacy
Finding and fixing defects does not help if the system built is unusable and does not fulfill the users' needs and expectations.








1.4 Fundamental Test Process (K1)                                                             35 minutes

Terms
Confirmation testing, re-testing, exit criteria, incident, regression testing, test basis, test condition, test coverage, test data, test execution, test log, test plan, test procedure, test policy, test suite, test summary report, testware

Background
The most visible part of testing is test execution. But to be effective and efficient, test plans should also include time to be spent on planning the tests, designing test cases, preparing for execution and evaluating results.

The fundamental test process consists of the following main activities:
o Test planning and control
o Test analysis and design
o Test implementation and execution
o Evaluating exit criteria and reporting
o Test closure activities

Although logically sequential, the activities in the process may overlap or take place concurrently. Tailoring these main activities within the context of the system and the project is usually required.

1.4.1 Test Planning and Control (K1)
Test planning is the activity of defining the objectives of testing and the specification of test activities in order to meet the objectives and mission.

Test control is the ongoing activity of comparing actual progress against the plan, and reporting the status, including deviations from the plan. It involves taking actions necessary to meet the mission and objectives of the project. In order to control testing, the testing activities should be monitored throughout the project. Test planning takes into account the feedback from monitoring and control activities.

Test planning and control tasks are defined in Chapter 5 of this syllabus.

1.4.2 Test Analysis and Design (K1)
Test analysis and design is the activity during which general testing objectives are transformed into tangible test conditions and test cases (a brief illustrative sketch is given at the end of this section).

The test analysis and design activity has the following major tasks:
o Reviewing the test basis (such as requirements, software integrity level¹ (risk level), risk analysis reports, architecture, design, interface specifications)
o Evaluating testability of the test basis and test objects
o Identifying and prioritizing test conditions based on analysis of test items, the specification, behavior and structure of the software
o Designing and prioritizing high level test cases
o Identifying necessary test data to support the test conditions and test cases
o Designing the test environment setup and identifying any required infrastructure and tools
o Creating bi-directional traceability between test basis and test cases

¹ The degree to which software complies or must comply with a set of stakeholder-selected software and/or software-based system characteristics (e.g., software complexity, risk assessment, safety level, security level, desired performance, reliability, or cost) which are defined to reflect the importance of the software to its stakeholders.
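
The following sketch is illustrative only (the requirement, identifiers and values are invented); it shows how a general objective might be turned into tangible test conditions and high-level test cases with traceability back to the test basis:

    # Hypothetical requirement REQ-DISCOUNT-1: orders of 100.00 or more receive
    # a 10% discount. Test conditions are identified from the test basis, and
    # high-level test cases are designed around the boundary, each tracing
    # back to the requirement (bi-directional traceability).

    test_conditions = [
        "order total below the discount threshold",
        "order total exactly at the discount threshold",
        "order total above the discount threshold",
    ]

    test_cases = [
        {"id": "TC-01", "order_total": 99.99,  "expected_discount": 0.00,  "traces_to": "REQ-DISCOUNT-1"},
        {"id": "TC-02", "order_total": 100.00, "expected_discount": 10.00, "traces_to": "REQ-DISCOUNT-1"},
        {"id": "TC-03", "order_total": 150.00, "expected_discount": 15.00, "traces_to": "REQ-DISCOUNT-1"},
    ]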


1.4.3 Test Implementation and Execution (K1)
Test implementation and execution is the activity where test procedures or scripts are specified by combining the test cases in a particular order and including any other information needed for test execution, the environment is set up and the tests are run.

Test implementation and execution has the following major tasks (a brief sketch follows the list):
o Finalizing, implementing and prioritizing test cases (including the identification of test data)
o Developing and prioritizing test procedures, creating test data and, optionally, preparing test harnesses and writing automated test scripts
o Creating test suites from the test procedures for efficient test execution
o Verifying that the test environment has been set up correctly
o Verifying and updating bi-directional traceability between the test basis and test cases
o Executing test procedures either manually or by using test execution tools, according to the planned sequence
o Logging the outcome of test execution and recording the identities and versions of the software under test, test tools and testware
o Comparing actual results with expected results
o Reporting discrepancies as incidents and analyzing them in order to establish their cause (e.g., a defect in the code, in specified test data, in the test document, or a mistake in the way the test was executed)
o Repeating test activities as a result of action taken for each discrepancy, for example, re-execution of a test that previously failed in order to confirm a fix (confirmation testing), execution of a corrected test and/or execution of tests in order to ensure that defects have not been introduced in unchanged areas of the software or that defect fixing did not uncover other defects (regression testing)
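
A minimal sketch of such an automated execution (illustrative only; the function under test and the test data continue the hypothetical discount example from Section 1.4.2):

    # Illustrative sketch: execute prepared test cases against the (assumed)
    # function under test, compare actual with expected results, and log the
    # outcome of each execution; a FAIL would be reported as an incident.
    import logging
    import math

    logging.basicConfig(level=logging.INFO, format="%(levelname)s %(message)s")

    def discount(order_total):  # hypothetical implementation under test
        return order_total * 0.10 if order_total >= 100.00 else 0.00

    test_cases = [
        ("TC-01", 99.99, 0.00),
        ("TC-02", 100.00, 10.00),
        ("TC-03", 150.00, 15.00),
    ]

    for test_id, order_total, expected in test_cases:
        actual = discount(order_total)
        outcome = "PASS" if math.isclose(actual, expected) else "FAIL"
        logging.info("%s input=%.2f expected=%.2f actual=%.2f %s",
                     test_id, order_total, expected, actual, outcome)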

1.4.4 Evaluating Exit Criteria and Reporting (K1)
Evaluating exit criteria is the activity where test execution is assessed against the defined objectives. This should be done for each test level (see Section 2.2).

Evaluating exit criteria has the following major tasks (a brief sketch follows the list):
o Checking test logs against the exit criteria specified in test planning
o Assessing if more tests are needed or if the exit criteria specified should be changed
o Writing a test summary report for stakeholders
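
A minimal sketch of such a check (illustrative only; the figures and thresholds are invented and would normally come from the test log and the test plan):

    # Illustrative sketch: assess recorded results against exit criteria that
    # would have been specified during test planning.
    results = {"passed": 47, "failed": 1, "blocked": 2}  # hypothetical figures from the test log
    open_critical_incidents = 0
    statement_coverage = 0.92

    exit_criteria_met = (
        results["failed"] == 0
        and open_critical_incidents == 0
        and statement_coverage >= 0.90
    )
    print("Exit criteria met:", exit_criteria_met)  # False here: test more, or revise the criteria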

1.4.5 Test Closure Activities (K1)
Test closure activities collect data from completed test activities to consolidate experience, testware, facts and numbers. Test closure activities occur at project milestones such as when a software system is released, a test project is completed (or cancelled), a milestone has been achieved, or a maintenance release has been completed.






Test closure activities include the following major tasks:
o Checking which planned deliverables have been delivered
o Closing incident reports or raising change records for any that remain open
o Documenting the acceptance of the system
o Finalizing and archiving testware, the test environment and the test infrastructure for later reuse
o Handing over the testware to the maintenance organization
o Analyzing lessons learned to determine changes needed for future releases and projects
o Using the information gathered to improve test maturity








1.5 The Psychology of Testing (K2)                                                            25 minutes

Terms
Error guessing, independence

Background
The mindset to be used while testing and reviewing is different from that used while developing software. With the right mindset developers are able to test their own code, but separation of this responsibility to a tester is typically done to help focus effort and provide additional benefits, such as an independent view by trained and professional testing resources. Independent testing may be carried out at any level of testing.

A certain degree of independence (avoiding the author bias) often makes the tester more effective at finding defects and failures. Independence is not, however, a replacement for familiarity, and developers can efficiently find many defects in their own code. Several levels of independence can be defined as shown here from low to high:
o Tests designed by the person(s) who wrote the software under test (low level of independence)
o Tests designed by another person(s) (e.g., from the development team)
o Tests designed by a person(s) from a different organizational group (e.g., an independent test team) or test specialists (e.g., usability or performance test specialists)
o Tests designed by a person(s) from a different organization or company (i.e., outsourcing or certification by an external body)

People and projects are driven by objectives. People tend to align their plans with the objectives set by management and other stakeholders, for example, to find defects or to confirm that software meets its objectives. Therefore, it is important to clearly state the objectives of testing.

Identifying failures during testing may be perceived as criticism against the product and against the author. As a result, testing is often seen as a destructive activity, even though it is very constructive in the management of product risks. Looking for failures in a system requires curiosity, professional pessimism, a critical eye, attention to detail, good communication with development peers, and experience on which to base error guessing.

If errors, defects or failures are communicated in a constructive way, bad feelings between the testers and the analysts, designers and developers can be avoided. This applies to defects found during reviews as well as in testing.

The tester and test leader need good interpersonal skills to communicate factual information about defects, progress and risks in a constructive way. For the author of the software or document, defect information can help them improve their skills. Defects found and fixed during testing will save time and money later, and reduce risks.

Communication problems may occur, particularly if testers are seen only as messengers of unwanted news about defects. However, there are several ways to improve communication and relationships between testers and others:






o Start with collaboration rather than battles – remind everyone of the common goal of better quality systems
o Communicate findings on the product in a neutral, fact-focused way without criticizing the person who created it, for example, write objective and factual incident reports and review findings
o Try to understand how the other person feels and why they react as they do
o Confirm that the other person has understood what you have said and vice versa








1.6 Code of Ethics                                                                            10 minutes

Involvement in software testing enables individuals to learn confidential and privileged information. A code of ethics is necessary, among other reasons, to ensure that the information is not put to inappropriate use. Recognizing the ACM and IEEE code of ethics for engineers, the ISTQB states the following code of ethics:

PUBLIC - Certified software testers shall act consistently with the public interest

CLIENT AND EMPLOYER - Certified software testers shall act in a manner that is in the best interests of their client and employer, consistent with the public interest

PRODUCT - Certified software testers shall ensure that the deliverables they provide (on the products and systems they test) meet the highest professional standards possible

JUDGMENT - Certified software testers shall maintain integrity and independence in their professional judgment

MANAGEMENT - Certified software test managers and leaders shall subscribe to and promote an ethical approach to the management of software testing

PROFESSION - Certified software testers shall advance the integrity and reputation of the profession consistent with the public interest

COLLEAGUES - Certified software testers shall be fair to and supportive of their colleagues, and promote cooperation with software developers

SELF - Certified software testers shall participate in lifelong learning regarding the practice of their profession and shall promote an ethical approach to the practice of the profession

References
1.1.5 Black, 2001, Kaner, 2002
1.2 Beizer, 1990, Black, 2001, Myers, 1979
1.3 Beizer, 1990, Hetzel, 1988, Myers, 1979
1.4 Hetzel, 1988
1.4.5 Black, 2001, Craig, 2002
1.5 Black, 2001, Hetzel, 1988








2. Testing Throughout the Software Life Cycle (K2)                                          115 minutes

Learning Objectives for Testing Throughout the Software Life Cycle
The objectives identify what you will be able to do following the completion of each module.

2.1 Software Development Models (K2)
LO-2.1.1 Explain the relationship between development, test activities and work products in the development life cycle, by giving examples using project and product types (K2)
LO-2.1.2 Recognize the fact that software development models must be adapted to the context of project and product characteristics (K1)
LO-2.1.3 Recall characteristics of good testing that are applicable to any life cycle model (K1)

2.2 Test Levels (K2)
LO-2.2.1 Compare the different levels of testing: major objectives, typical objects of testing, typical targets of testing (e.g., functional or structural) and related work products, people who test, types of defects and failures to be identified (K2)

2.3 Test Types (K2)
LO-2.3.1 Compare four software test types (functional, non-functional, structural and change-related) by example (K2)
LO-2.3.2 Recognize that functional and structural tests occur at any test level (K1)
LO-2.3.3 Identify and describe non-functional test types based on non-functional requirements (K2)
LO-2.3.4 Identify and describe test types based on the analysis of a software system's structure or architecture (K2)
LO-2.3.5 Describe the purpose of confirmation testing and regression testing (K2)

2.4 Maintenance Testing (K2)
LO-2.4.1 Compare maintenance testing (testing an existing system) to testing a new application with respect to test types, triggers for testing and amount of testing (K2)
LO-2.4.2 Recognize indicators for maintenance testing (modification, migration and retirement) (K1)
LO-2.4.3 Describe the role of regression testing and impact analysis in maintenance (K2)








2.1 Software Development Models (K2)                                                        20 minutes

Terms
Commercial Off-The-Shelf (COTS), iterative-incremental development model, validation, verification, V-model

Background
Testing does not exist in isolation; test activities are related to software development activities. Different development life cycle models need different approaches to testing.

2.1.1 V-model (Sequential Development Model) (K2)
Although variants of the V-model exist, a common type of V-model uses four test levels, corresponding to the four development levels.

The four levels used in this syllabus are:
o Component (unit) testing
o Integration testing
o System testing
o Acceptance testing

In practice, a V-model may have more, fewer or different levels of development and testing, depending on the project and the software product. For example, there may be component integration testing after component testing, and system integration testing after system testing.

Software work products (such as business scenarios or use cases, requirements specifications, design documents and code) produced during development are often the basis of testing in one or more test levels. References for generic work products include Capability Maturity Model Integration (CMMI) or 'Software life cycle processes' (IEEE/IEC 12207). Verification and validation (and early test design) can be carried out during the development of the software work products.

2.1.2 Iterative-incremental Development Models (K2)
Iterative-incremental development is the process of establishing requirements, designing, building and testing a system in a series of short development cycles. Examples are: prototyping, Rapid Application Development (RAD), Rational Unified Process (RUP) and agile development models. A system that is produced using these models may be tested at several test levels during each iteration. An increment, added to others developed previously, forms a growing partial system, which should also be tested. Regression testing is increasingly important on all iterations after the first one. Verification and validation can be carried out on each increment.

2.1.3 Testing within a Life Cycle Model (K2)
In any life cycle model, there are several characteristics of good testing:
o For every development activity there is a corresponding testing activity
o Each test level has test objectives specific to that level
o The analysis and design of tests for a given test level should begin during the corresponding development activity
o Testers should be involved in reviewing documents as soon as drafts are available in the development life cycle

Test levels can be combined or reorganized depending on the nature of the project or the system architecture. For example, for the integration of a Commercial Off-The-Shelf (COTS) software product into a system, the purchaser may perform integration testing at the system level (e.g., integration to the infrastructure and other systems, or system deployment) and acceptance testing (functional and/or non-functional, and user and/or operational testing).








2.2 Test Levels (K2)                                                                        40 minutes

Terms
Alpha testing, beta testing, component testing, driver, field testing, functional requirement, integration, integration testing, non-functional requirement, robustness testing, stub, system testing, test environment, test level, test-driven development, user acceptance testing

Background
For each of the test levels, the following can be identified: the generic objectives, the work product(s) being referenced for deriving test cases (i.e., the test basis), the test object (i.e., what is being tested), typical defects and failures to be found, test harness requirements and tool support, and specific approaches and responsibilities.

Testing a system's configuration data shall be considered during test planning.

2.2.1 Component Testing (K2)
Test basis:
o Component requirements
o Detailed design
o Code

Typical test objects:
o Components
o Programs
o Data conversion / migration programs
o Database modules

Component testing (also known as unit, module or program testing) searches for defects in, and verifies the functioning of, software modules, programs, objects, classes, etc., that are separately testable. It may be done in isolation from the rest of the system, depending on the context of the development life cycle and the system. Stubs, drivers and simulators may be used.
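As an illustration only (not part of the syllabus text), the following minimal Python sketch shows how a driver and a stub can isolate a component for testing. The module and names used here (calculate_total, price_service, get_price) are hypothetical and stand in for whatever the project's real component and its collaborator would be.

# Minimal sketch: testing a component in isolation using a stub and a driver.
import unittest
from unittest import mock

def calculate_total(item_ids, price_service):
    """Component under test: sums the prices returned by a collaborator."""
    return sum(price_service.get_price(item_id) for item_id in item_ids)

class CalculateTotalTest(unittest.TestCase):
    """The test case acts as the driver that calls the component."""

    def test_total_of_two_items(self):
        # Stub: replaces the real price service so the component can be
        # exercised without the rest of the system being available.
        price_stub = mock.Mock()
        price_stub.get_price.side_effect = lambda item_id: {"A": 10, "B": 5}[item_id]

        self.assertEqual(calculate_total(["A", "B"], price_stub), 15)

if __name__ == "__main__":
    unittest.main()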

Component testing may include testing of functionality and specific non-functional characteristics, such as resource-behavior (e.g., searching for memory leaks) or robustness testing, as well as structural testing (e.g., decision coverage). Test cases are derived from work products such as a specification of the component, the software design or the data model.

Typically, component testing occurs with access to the code being tested and with the support of a development environment, such as a unit test framework or debugging tool. In practice, component testing usually involves the programmer who wrote the code. Defects are typically fixed as soon as they are found, without formally managing the defects.

One approach to component testing is to prepare and automate test cases before coding. This is called a test-first approach or test-driven development. This approach is highly iterative and is based on cycles of developing test cases, then building and integrating small pieces of code, and executing the component tests, correcting any issues and iterating until they pass.
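A minimal test-first sketch, for illustration only: the automated test for a hypothetical is_leap_year() function is written before the code exists, the implementation is then added just to make the test pass, and the cycle repeats for the next small piece of behavior.

# Minimal sketch of one test-first cycle (names are hypothetical).
import unittest

class LeapYearTest(unittest.TestCase):
    # Step 1: the test cases are prepared and automated before coding.
    def test_century_years_are_leap_only_if_divisible_by_400(self):
        self.assertTrue(is_leap_year(2000))
        self.assertFalse(is_leap_year(1900))

    def test_ordinary_years(self):
        self.assertTrue(is_leap_year(2024))
        self.assertFalse(is_leap_year(2023))

# Step 2: a small piece of code is written and integrated to make the
# tests pass; issues are corrected and the cycle is repeated.
def is_leap_year(year):
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

if __name__ == "__main__":
    unittest.main()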






2.2.2 Integration Testing (K2)
Test basis:
o Software and system design
o Architecture
o Workflows
o Use cases

Typical test objects:
o Subsystems
o Database implementation
o Infrastructure
o Interfaces
o System configuration and configuration data

Integration testing tests interfaces between components, interactions with different parts of a system, such as the operating system, file system and hardware, and interfaces between systems.

There may be more than one level of integration testing and it may be carried out on test objects of varying size as follows:
1. Component integration testing tests the interactions between software components and is done after component testing
2. System integration testing tests the interactions between different systems or between hardware and software and may be done after system testing. In this case, the developing organization may control only one side of the interface. This might be considered as a risk. Business processes implemented as workflows may involve a series of systems. Cross-platform issues may be significant.

The greater the scope of integration, the more difficult it becomes to isolate defects to a specific component or system, which may lead to increased risk and additional time for troubleshooting.

Systematic integration strategies may be based on the system architecture (such as top-down and bottom-up), functional tasks, transaction processing sequences, or some other aspect of the system or components. In order to ease fault isolation and detect defects early, integration should normally be incremental rather than "big bang".

Testing of specific non-functional characteristics (e.g., performance) may be included in integration testing as well as functional testing.

At each stage of integration, testers concentrate solely on the integration itself. For example, if they are integrating module A with module B they are interested in testing the communication between the modules, not the functionality of the individual modules, as that was done during component testing. Both functional and structural approaches may be used.

Ideally, testers should understand the architecture and influence integration planning. If integration tests are planned before components or systems are built, those components can be built in the order required for most efficient testing.






2.2.3 System Testing (K2)
Test basis:
o System and software requirement specification
o Use cases
o Functional specification
o Risk analysis reports

Typical test objects:
o System, user and operation manuals
o System configuration and configuration data

System testing is concerned with the behavior of a whole system/product. The testing scope shall be clearly addressed in the Master and/or Level Test Plan for that test level.

In system testing, the test environment should correspond to the final target or production environment as much as possible in order to minimize the risk of environment-specific failures not being found in testing.

System testing may include tests based on risks and/or on requirements specifications, business processes, use cases, or other high level text descriptions or models of system behavior, interactions with the operating system, and system resources.

System testing should investigate functional and non-functional requirements of the system, and data quality characteristics. Testers also need to deal with incomplete or undocumented requirements. System testing of functional requirements starts by using the most appropriate specification-based (black-box) techniques for the aspect of the system to be tested. For example, a decision table may be created for combinations of effects described in business rules, as in the sketch below. Structure-based techniques (white-box) may then be used to assess the thoroughness of the testing with respect to a structural element, such as menu structure or web page navigation (see Chapter 4).
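The following Python sketch is illustrative only: it shows how each row of a small decision table for a hypothetical business rule (a discount based on membership and order size) can become one system-level test case. The rule, the expected values and apply_discount() are assumptions for the example, not content of the syllabus.

# Minimal sketch: decision-table-driven test cases for a business rule.
import unittest

def apply_discount(is_member, order_total):
    """Stand-in for the system behavior described by the business rule."""
    if is_member and order_total >= 100:
        return order_total * 0.90   # rule 1: member, large order -> 10% off
    if is_member:
        return order_total * 0.95   # rule 2: member, small order -> 5% off
    return order_total              # rules 3-4: non-member -> no discount

# Decision table: each row is one combination of conditions plus the
# expected effect; each row is executed as one test case.
DECISION_TABLE = [
    # (is_member, order_total, expected_total)
    (True,  200, 180.0),
    (True,   50,  47.5),
    (False, 200, 200.0),
    (False,  50,  50.0),
]

class DiscountRulesTest(unittest.TestCase):
    def test_decision_table_rules(self):
        for is_member, total, expected in DECISION_TABLE:
            with self.subTest(is_member=is_member, total=total):
                self.assertAlmostEqual(apply_discount(is_member, total), expected)

if __name__ == "__main__":
    unittest.main()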

An independent test team often carries out system testing.

2.2.4 Acceptance Testing (K2)
Test basis:
o User requirements
o System requirements
o Use cases
o Business processes
o Risk analysis reports

Typical test objects:
o Business processes on fully integrated system
o Operational and maintenance processes
o User procedures
o Forms
o Reports
o Configuration data

Acceptance testing is often the responsibility of the customers or users of a system; other stakeholders may be involved as well.

The goal in acceptance testing is to establish confidence in the system, parts of the system or specific non-functional characteristics of the system. Finding defects is not the main focus in acceptance testing. Acceptance testing may assess the system's readiness for deployment and use, although it is not necessarily the final level of testing. For example, a large-scale system integration test may come after the acceptance test for a system.

Acceptance testing may occur at various times in the life cycle, for example:
o A COTS software product may be acceptance tested when it is installed or integrated
o Acceptance testing of the usability of a component may be done during component testing
o Acceptance testing of a new functional enhancement may come before system testing

Typical forms of acceptance testing include the following:

User acceptance testing
Typically verifies the fitness for use of the system by business users.

Operational (acceptance) testing
The acceptance of the system by the system administrators, including:
o Testing of backup/restore
o Disaster recovery
o User management
o Maintenance tasks
o Data load and migration tasks
o Periodic checks of security vulnerabilities

Contract and regulation acceptance testing
Contract acceptance testing is performed against a contract's acceptance criteria for producing custom-developed software. Acceptance criteria should be defined when the parties agree to the contract. Regulation acceptance testing is performed against any regulations that must be adhered to, such as government, legal or safety regulations.

Alpha and beta (or field) testing
Developers of market, or COTS, software often want to get feedback from potential or existing customers in their market before the software product is put up for sale commercially. Alpha testing is performed at the developing organization's site but not by the developing team. Beta testing, or field-testing, is performed by customers or potential customers at their own locations.

Organizations may use other terms as well, such as factory acceptance testing and site acceptance testing for systems that are tested before and after being moved to a customer's site.








2.3 Test Types (K2)                                                                         40 minutes

Terms
Black-box testing, code coverage, functional testing, interoperability testing, load testing, maintainability testing, performance testing, portability testing, reliability testing, security testing, stress testing, structural testing, usability testing, white-box testing

Background
A group of test activities can be aimed at verifying the software system (or a part of a system) based on a specific reason or target for testing.

A test type is focused on a particular test objective, which could be any of the following:
o A function to be performed by the software
o A non-functional quality characteristic, such as reliability or usability
o The structure or architecture of the software or system
o Change related, i.e., confirming that defects have been fixed (confirmation testing) and looking for unintended changes (regression testing)

A model of the software may be developed and/or used in structural testing (e.g., a control flow model or menu structure model), non-functional testing (e.g., performance model, usability model, security threat modeling), and functional testing (e.g., a process flow model, a state transition model or a plain language specification).

2.3.1 Testing of Function (Functional Testing) (K2)
The functions that a system, subsystem or component are to perform may be described in work products such as a requirements specification, use cases, or a functional specification, or they may be undocumented. The functions are "what" the system does.

Functional tests are based on functions and features (described in documents or understood by the testers) and their interoperability with specific systems, and may be performed at all test levels (e.g., tests for components may be based on a component specification).

Specification-based techniques may be used to derive test conditions and test cases from the functionality of the software or system (see Chapter 4). Functional testing considers the external behavior of the software (black-box testing).

A type of functional testing, security testing, investigates the functions (e.g., a firewall) relating to detection of threats, such as viruses, from malicious outsiders. Another type of functional testing, interoperability testing, evaluates the capability of the software product to interact with one or more specified components or systems.

2.3.2 Testing of Non-functional Software Characteristics (Non-functional Testing) (K2)
Non-functional testing includes, but is not limited to, performance testing, load testing, stress testing, usability testing, maintainability testing, reliability testing and portability testing. It is the testing of "how" the system works.

Non-functional testing may be performed at all test levels. The term non-functional testing describes the tests required to measure characteristics of systems and software that can be quantified on a varying scale, such as response times for performance testing. These tests can be referenced to a quality model such as the one defined in 'Software Engineering – Software Product Quality' (ISO 9126). Non-functional testing considers the external behavior of the software and in most cases uses black-box test design techniques to accomplish that.
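As an illustrative sketch only: a non-functional characteristic such as response time can be measured and compared against a quantified requirement. The 500 ms threshold and the handle_request() function below are assumptions made up for the example; real performance testing would normally use dedicated load-generation tooling rather than a single timed call.

# Minimal sketch: checking a response time against an assumed requirement.
import time
import unittest

def handle_request():
    """Stand-in for the operation whose response time is being measured."""
    time.sleep(0.05)
    return "ok"

class ResponseTimeTest(unittest.TestCase):
    def test_response_time_is_within_requirement(self):
        start = time.perf_counter()
        result = handle_request()
        elapsed_ms = (time.perf_counter() - start) * 1000

        self.assertEqual(result, "ok")
        # Characteristic quantified on a varying scale; the requirement
        # here is assumed to be "under 500 ms".
        self.assertLess(elapsed_ms, 500)

if __name__ == "__main__":
    unittest.main()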

2.3.3 Testing of Software Structure/Architecture (Structural Testing) (K2)
Structural (white-box) testing may be performed at all test levels. Structural techniques are best used after specification-based techniques, in order to help measure the thoroughness of testing through assessment of coverage of a type of structure.

Coverage is the extent that a structure has been exercised by a test suite, expressed as a percentage of the items being covered. If coverage is not 100%, then more tests may be designed to test those items that were missed to increase coverage. Coverage techniques are covered in Chapter 4.

At all test levels, but especially in component testing and component integration testing, tools can be used to measure the code coverage of elements, such as statements or decisions. Structural testing may be based on the architecture of the system, such as a calling hierarchy.
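For illustration only, the sketch below shows how test cases relate to decision coverage. The grade() function is hypothetical; it contains two decisions, and running only the first test would leave some decision outcomes unexercised, so coverage would be below 100% and further tests are added to reach the missed outcomes. A code coverage tool can then report the achieved percentage automatically.

# Minimal sketch: test cases chosen to exercise both outcomes of each decision.
import unittest

def grade(score):
    if score >= 80:        # decision 1
        return "pass with merit"
    if score >= 50:        # decision 2
        return "pass"
    return "fail"

class GradeDecisionCoverageTest(unittest.TestCase):
    def test_decision_1_true(self):
        self.assertEqual(grade(90), "pass with merit")

    def test_decision_1_false_decision_2_true(self):
        self.assertEqual(grade(60), "pass")

    def test_decision_2_false(self):
        self.assertEqual(grade(30), "fail")

if __name__ == "__main__":
    unittest.main()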

Structural testing approaches can also be applied at system, system integration or acceptance testing levels (e.g., to business models or menu structures).

2.3.4 Testing Related to Changes: Re-testing and Regression Testing (K2)

After a defect is detected and fixed, the software should be re-tested to confirm that the original defect has been successfully removed. This is called confirmation. Debugging (locating and fixing a defect) is a development activity, not a testing activity.

Regression testing is the repeated testing of an already tested program, after modification, to discover any defects introduced or uncovered as a result of the change(s). These defects may be either in the software being tested, or in another related or unrelated software component. It is performed when the software, or its environment, is changed. The extent of regression testing is based on the risk of not finding defects in software that was working previously.

Tests should be repeatable if they are to be used for confirmation testing and to assist regression testing.

Regression testing may be performed at all test levels, and includes functional, non-functional and structural testing. Regression test suites are run many times and generally evolve slowly, so regression testing is a strong candidate for automation.
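A minimal, illustrative sketch of why repeatable automated tests support both confirmation and regression testing: the defect identifier and the parse_amount() function below are hypothetical. The confirmation test added when the defect was fixed stays in the suite, so the fix is re-checked on every subsequent run alongside the existing regression checks.

# Minimal sketch: an automated suite combining regression and confirmation tests.
import unittest

def parse_amount(text):
    """Function under test: converts '1,234.50' style strings to float."""
    return float(text.replace(",", ""))

class ParseAmountRegressionSuite(unittest.TestCase):
    def test_plain_number_still_works(self):
        # Regression check: behavior that worked before the change.
        self.assertEqual(parse_amount("42"), 42.0)

    def test_defect_1234_thousands_separator(self):
        # Confirmation test added when hypothetical defect #1234 was fixed.
        self.assertEqual(parse_amount("1,234.50"), 1234.5)

if __name__ == "__main__":
    unittest.main()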








2.4 Maintenance Testing (K2)                                                                15 minutes

Terms
Impact analysis, maintenance testing

Background
Once deployed, a software system is often in service for years or decades. During this time the system, its configuration data, or its environment are often corrected, changed or extended. The planning of releases in advance is crucial for successful maintenance testing. A distinction has to be made between planned releases and hot fixes. Maintenance testing is done on an existing operational system, and is triggered by modifications, migration, or retirement of the software or system.

Modifications include planned enhancement changes (e.g., release-based), corrective and emergency changes, and changes of environment, such as planned operating system or database upgrades, planned upgrade of Commercial-Off-The-Shelf software, or patches to correct newly exposed or discovered vulnerabilities of the operating system.

Maintenance testing for migration (e.g., from one platform to another) should include operational tests of the new environment as well as of the changed software. Migration testing (conversion testing) is also needed when data from another application will be migrated into the system being maintained.

Maintenance testing for the retirement of a system may include the testing of data migration or archiving if long data-retention periods are required.

In addition to testing what has been changed, maintenance testing includes regression testing of parts of the system that have not been changed. The scope of maintenance testing is related to the risk of the change, the size of the existing system and to the size of the change. Depending on the changes, maintenance testing may be done at any or all test levels and for any or all test types. Determining how the existing system may be affected by changes is called impact analysis, and is used to help decide how much regression testing to do. The impact analysis may be used to determine the regression test suite.
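How an impact analysis can drive the selection of a regression test suite is illustrated by the small sketch below. It is not part of the syllabus; it assumes a hypothetical mapping of test cases to the modules they exercise, and Python is used only for illustration.

# Minimal sketch: select regression tests from an impact analysis.
# TEST_COVERAGE is a hypothetical mapping of test cases to the modules
# they exercise; changed_modules comes from the impact analysis.
TEST_COVERAGE = {
    "TC-01 login":          {"auth", "session"},
    "TC-02 checkout":       {"cart", "payment"},
    "TC-03 profile update": {"auth", "profile"},
    "TC-04 reporting":      {"reports"},
}

def select_regression_suite(changed_modules):
    changed = set(changed_modules)
    # Keep every test case whose covered modules overlap the changed ones.
    return sorted(test for test, modules in TEST_COVERAGE.items()
                  if modules & changed)

if __name__ == "__main__":
    # Example: a hot fix touched the "auth" module.
    print(select_regression_suite(["auth"]))   # ['TC-01 login', 'TC-03 profile update']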

Maintenance testing can be difficult if specifications are out of date or missing, or testers with domain knowledge are not available.

References
2.1.3 CMMI, Craig, 2002, Hetzel, 1988, IEEE 12207
2.2 Hetzel, 1988
2.2.4 Copeland, 2004, Myers, 1979
2.3.1 Beizer, 1990, Black, 2001, Copeland, 2004
2.3.2 Black, 2001, ISO 9126
2.3.3 Beizer, 1990, Copeland, 2004, Hetzel, 1988
2.3.4 Hetzel, 1988, IEEE STD 829-1998
2.4 Black, 2001, Craig, 2002, Hetzel, 1988, IEEE STD 829-1998




3. Static Techniques (K2)                                                           60 minutes

Learning Objectives for Static Techniques
The objectives identify what you will be able to do following the completion of each module.

3.1 Static Techniques and the Test Process (K2)
LO-3.1.1  Recognize software work products that can be examined by the different static techniques (K1)
LO-3.1.2  Describe the importance and value of considering static techniques for the assessment of software work products (K2)
LO-3.1.3  Explain the difference between static and dynamic techniques, considering objectives, types of defects to be identified, and the role of these techniques within the software life cycle (K2)

3.2 Review Process (K2)
LO-3.2.1  Recall the activities, roles and responsibilities of a typical formal review (K1)
LO-3.2.2  Explain the differences between different types of reviews: informal review, technical review, walkthrough and inspection (K2)
LO-3.2.3  Explain the factors for successful performance of reviews (K2)

3.3 Static Analysis by Tools (K2)
LO-3.3.1  Recall typical defects and errors identified by static analysis and compare them to reviews and dynamic testing (K1)
LO-3.3.2  Describe, using examples, the typical benefits of static analysis (K2)
LO-3.3.3  List typical code and design defects that may be identified by static analysis tools (K1)

3.1 Static Techniques and the Test Process (K2)                                    15 minutes

Terms
Dynamic testing, static testing

Background
Unlike dynamic testing, which requires the execution of software, static testing techniques rely on the manual examination (reviews) and automated analysis (static analysis) of the code or other project documentation without the execution of the code.

Reviews are a way of testing software work products (including code) and can be performed well before dynamic test execution. Defects detected during reviews early in the life cycle (e.g., defects found in requirements) are often much cheaper to remove than those detected by running tests on the executing code.

A review could be done entirely as a manual activity, but there is also tool support. The main manual activity is to examine a work product and make comments about it. Any software work product can be reviewed, including requirements specifications, design specifications, code, test plans, test specifications, test cases, test scripts, user guides or web pages.

Benefits of reviews include early defect detection and correction, development productivity improvements, reduced development timescales, reduced testing cost and time, lifetime cost reductions, fewer defects and improved communication. Reviews can find omissions, for example in requirements, which are unlikely to be found in dynamic testing.

Reviews, static analysis and dynamic testing have the same objective – identifying defects. They are complementary; the different techniques can find different types of defects effectively and efficiently. Compared to dynamic testing, static techniques find causes of failures (defects) rather than the failures themselves.

Typical defects that are easier to find in reviews than in dynamic testing include: deviations from standards, requirement defects, design defects, insufficient maintainability and incorrect interface specifications.

3.2 Review Process (K2)                                                             25 minutes

Terms
Entry criteria, formal review, informal review, inspection, metric, moderator, peer review, reviewer, scribe, technical review, walkthrough

Background
The different types of reviews vary from informal, characterized by no written instructions for reviewers, to systematic, characterized by team participation, documented results of the review, and documented procedures for conducting the review. The formality of a review process is related to factors such as the maturity of the development process, any legal or regulatory requirements or the need for an audit trail.

The way a review is carried out depends on the agreed objectives of the review (e.g., find defects, gain understanding, educate testers and new team members, or discussion and decision by consensus).

3.2.1 Activities of a Formal Review (K1)
A typical formal review has the following main activities:

1. Planning
   • Defining the review criteria
   • Selecting the personnel
   • Allocating roles
   • Defining the entry and exit criteria for more formal review types (e.g., inspections)
   • Selecting which parts of documents to review
   • Checking entry criteria (for more formal review types)
2. Kick-off
   • Distributing documents
   • Explaining the objectives, process and documents to the participants
3. Individual preparation
   • Preparing for the review meeting by reviewing the document(s)
   • Noting potential defects, questions and comments
4. Examination/evaluation/recording of results (review meeting)
   • Discussing or logging, with documented results or minutes (for more formal review types)
   • Noting defects, making recommendations regarding handling the defects, making decisions about the defects
   • Examining/evaluating and recording issues during any physical meetings or tracking any group electronic communications
5. Rework
   • Fixing defects found (typically done by the author)
   • Recording updated status of defects (in formal reviews)
6. Follow-up
   • Checking that defects have been addressed
   • Gathering metrics
   • Checking on exit criteria (for more formal review types)

3.2.2 Roles and Responsibilities (K1)
A typical formal review will include the roles below:
o Manager: decides on the execution of reviews, allocates time in project schedules and determines if the review objectives have been met.
o Moderator: the person who leads the review of the document or set of documents, including planning the review, running the meeting, and following up after the meeting. If necessary, the moderator may mediate between the various points of view and is often the person upon whom the success of the review rests.
o Author: the writer or person with chief responsibility for the document(s) to be reviewed.
o Reviewers: individuals with a specific technical or business background (also called checkers or inspectors) who, after the necessary preparation, identify and describe findings (e.g., defects) in the product under review. Reviewers should be chosen to represent different perspectives and roles in the review process, and should take part in any review meetings.
o Scribe (or recorder): documents all the issues, problems and open points that were identified during the meeting.

Looking at software products or related work products from different perspectives and using checklists can make reviews more effective and efficient. For example, a checklist based on various perspectives such as user, maintainer, tester or operations, or a checklist of typical requirements problems, may help to uncover previously undetected issues.

3.2.3 Types of Reviews (K2)
A single software product or related work product may be the subject of more than one review. If more than one type of review is used, the order may vary. For example, an informal review may be carried out before a technical review, or an inspection may be carried out on a requirements specification before a walkthrough with customers. The main characteristics, options and purposes of common review types are:

Informal Review
o No formal process
o May take the form of pair programming or a technical lead reviewing designs and code
o Results may be documented
o Varies in usefulness depending on the reviewers
o Main purpose: inexpensive way to get some benefit

Walkthrough
o Meeting led by author
o May take the form of scenarios, dry runs, peer group participation
o Open-ended sessions
   • Optional pre-meeting preparation of reviewers
   • Optional preparation of a review report including list of findings
o Optional scribe (who is not the author)
o May vary in practice from quite informal to very formal
o Main purposes: learning, gaining understanding, finding defects

Technical Review
o Documented, defined defect-detection process that includes peers and technical experts with optional management participation
o May be performed as a peer review without management participation
o Ideally led by trained moderator (not the author)
o Pre-meeting preparation by reviewers
o Optional use of checklists
o Preparation of a review report which includes the list of findings, the verdict whether the software product meets its requirements and, where appropriate, recommendations related to findings
o May vary in practice from quite informal to very formal
o Main purposes: discussing, making decisions, evaluating alternatives, finding defects, solving technical problems and checking conformance to specifications, plans, regulations, and standards

Inspection
o Led by trained moderator (not the author)
o Usually conducted as a peer examination
o Defined roles
o Includes metrics gathering
o Formal process based on rules and checklists
o Specified entry and exit criteria for acceptance of the software product
o Pre-meeting preparation
o Inspection report including list of findings
o Formal follow-up process (with optional process improvement components)
o Optional reader
o Main purpose: finding defects

Walkthroughs, technical reviews and inspections can be performed within a peer group, i.e., colleagues at the same organizational level. This type of review is called a “peer review”.

3.2.4 Success Factors for Reviews (K2)
Success factors for reviews include:
o Each review has clear predefined objectives
o The right people for the review objectives are involved
o Testers are valued reviewers who contribute to the review and also learn about the product, which enables them to prepare tests earlier
o Defects found are welcomed and expressed objectively
o People issues and psychological aspects are dealt with (e.g., making it a positive experience for the author)
o The review is conducted in an atmosphere of trust; the outcome will not be used for the evaluation of the participants
o Review techniques are applied that are suitable to achieve the objectives and to the type and level of software work products and reviewers
o Checklists or roles are used if appropriate to increase effectiveness of defect identification
o Training is given in review techniques, especially the more formal techniques such as inspection
o Management supports a good review process (e.g., by incorporating adequate time for review activities in project schedules)
o There is an emphasis on learning and process improvement

3.3 Static Analysis by Tools (K2)                                                   20 minutes

Terms
Compiler, complexity, control flow, data flow, static analysis

Background
The objective of static analysis is to find defects in software source code and software models. Static analysis is performed without actually executing the software being examined by the tool; dynamic testing does execute the software code. Static analysis can locate defects that are hard to find in dynamic testing. As with reviews, static analysis finds defects rather than failures. Static analysis tools analyze program code (e.g., control flow and data flow), as well as generated output such as HTML and XML.

The value of static analysis is:
o Early detection of defects prior to test execution
o Early warning about suspicious aspects of the code or design by the calculation of metrics, such as a high complexity measure
o Identification of defects not easily found by dynamic testing
o Detecting dependencies and inconsistencies in software models such as links
o Improved maintainability of code and design
o Prevention of defects, if lessons are learned in development

Typical defects discovered by static analysis tools include:
o Referencing a variable with an undefined value
o Inconsistent interfaces between modules and components
o Variables that are not used or are improperly declared
o Unreachable (dead) code
o Missing and erroneous logic (potentially infinite loops)
o Overly complicated constructs
o Programming standards violations
o Security vulnerabilities
o Syntax violations of code and software models
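As a small illustration of how such tools work, the sketch below (not a description of any particular product) parses Python source into an abstract syntax tree and, without executing the program, reports statements that can never be reached because they follow a return in the same block.

# Minimal sketch of a static analysis check: report unreachable ("dead")
# code without executing the program under analysis.
import ast

EXAMPLE_SOURCE = """
def discount(price):
    if price < 0:
        return 0
        print("never reached")   # dead code: follows a return
    return price * 0.9
"""

def find_dead_code(source):
    findings = []
    for node in ast.walk(ast.parse(source)):
        body = getattr(node, "body", None)
        if not isinstance(body, list):
            continue
        for stmt, successor in zip(body, body[1:]):
            if isinstance(stmt, (ast.Return, ast.Raise, ast.Break, ast.Continue)):
                findings.append("line %d: unreachable statement after %s"
                                % (successor.lineno, type(stmt).__name__.lower()))
    return findings

if __name__ == "__main__":
    for finding in find_dead_code(EXAMPLE_SOURCE):
        print(finding)            # line 5: unreachable statement after return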

Static analysis tools are typically used by developers (checking against predefined rules or programming standards) before and during component and integration testing or when checking in code to configuration management tools, and by designers during software modelling. Static analysis tools may produce a large number of warning messages, which need to be well managed to allow the most effective use of the tool.

Compilers may offer some support for static analysis, including the calculation of metrics.

References
3.2 IEEE 1028
3.2.2 Gilb, 1993, van Veenendaal, 2004
3.2.4 Gilb, 1993, IEEE 1028
3.3 van Veenendaal, 2004

4. Test Design Techniques (K4)                                                     285 minutes

Learning Objectives for Test Design Techniques
The objectives identify what you will be able to do following the completion of each module.

4.1 The Test Development Process (K3)
LO-4.1.1  Differentiate between a test design specification, test case specification and test procedure specification (K2)
LO-4.1.2  Compare the terms test condition, test case and test procedure (K2)
LO-4.1.3  Evaluate the quality of test cases in terms of clear traceability to the requirements and expected results (K2)
LO-4.1.4  Translate test cases into a well-structured test procedure specification at a level of detail relevant to the knowledge of the testers (K3)

4.2 Categories of Test Design Techniques (K2)
LO-4.2.1  Recall reasons that both specification-based (black-box) and structure-based (white-box) test design techniques are useful, and list the common techniques for each (K1)
LO-4.2.2  Explain the characteristics, commonalities, and differences between specification-based testing, structure-based testing and experience-based testing (K2)

4.3 Specification-based or Black-box Techniques (K3)
LO-4.3.1  Write test cases from given software models using equivalence partitioning, boundary value analysis, decision tables and state transition diagrams/tables (K3)
LO-4.3.2  Explain the main purpose of each of the four testing techniques, what level and type of testing could use the technique, and how coverage may be measured (K2)
LO-4.3.3  Explain the concept of use case testing and its benefits (K2)

4.4 Structure-based or White-box Techniques (K4)
LO-4.4.1  Describe the concept and value of code coverage (K2)
LO-4.4.2  Explain the concepts of statement and decision coverage, and give reasons why these concepts can also be used at test levels other than component testing (e.g., on business procedures at system level) (K2)
LO-4.4.3  Write test cases from given control flows using statement and decision test design techniques (K3)
LO-4.4.4  Assess statement and decision coverage for completeness with respect to defined exit criteria (K4)

4.5 Experience-based Techniques (K2)
LO-4.5.1  Recall reasons for writing test cases based on intuition, experience and knowledge about common defects (K1)
LO-4.5.2  Compare experience-based techniques with specification-based testing techniques (K2)

4.6 Choosing Test Techniques (K2)
LO-4.6.1  Classify test design techniques according to their fitness to a given context, for the test basis, respective models and software characteristics (K2)

4.1 The Test Development Process (K3)                                               15 minutes

Terms
Test case specification, test design, test execution schedule, test procedure specification, test script, traceability

Background
The test development process described in this section can be done in different ways, from very informal with little or no documentation, to very formal (as it is described below). The level of formality depends on the context of the testing, including the maturity of testing and development processes, time constraints, safety or regulatory requirements, and the people involved.

During test analysis, the test basis documentation is analyzed in order to determine what to test, i.e., to identify the test conditions. A test condition is defined as an item or event that could be verified by one or more test cases (e.g., a function, transaction, quality characteristic or structural element).

Establishing traceability from test conditions back to the specifications and requirements enables both effective impact analysis when requirements change, and determining requirements coverage for a set of tests. During test analysis the detailed test approach is implemented to select the test design techniques to use, based on, among other considerations, the identified risks (see Chapter 5 for more on risk analysis).
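A traceability record can be as simple as a table linking requirement, test condition and test case. The sketch below is illustrative only (the identifiers are invented); it answers the two questions mentioned above: which tests are affected when a requirement changes, and which requirements are covered by the current set of tests.

# Minimal sketch of requirements-to-test traceability (identifiers invented).
TRACEABILITY = [
    # (requirement, test condition, test case)
    ("REQ-10", "valid credentials accepted",    "TC-01"),
    ("REQ-10", "invalid credentials rejected",  "TC-02"),
    ("REQ-11", "password reset email sent",     "TC-03"),
]

def tests_for_requirement(req_id):
    # Impact analysis: which test cases must be revisited if req_id changes?
    return sorted({case for req, _cond, case in TRACEABILITY if req == req_id})

def covered_requirements():
    # Requirements coverage for the current set of tests.
    return sorted({req for req, _cond, _case in TRACEABILITY})

if __name__ == "__main__":
    print(tests_for_requirement("REQ-10"))   # ['TC-01', 'TC-02']
    print(covered_requirements())            # ['REQ-10', 'REQ-11']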

During test design the test cases and test data are created and specified. A test case consists of a set of input values, execution preconditions, expected results and execution postconditions, defined to cover a certain test objective(s) or test condition(s). The ‘Standard for Software Test Documentation’ (IEEE STD 829-1998) describes the content of test design specifications (containing test conditions) and test case specifications.

Expected results should be produced as part of the specification of a test case and include outputs, changes to data and states, and any other consequences of the test. If expected results have not been defined, then a plausible but erroneous result may be interpreted as the correct one. Expected results should ideally be defined prior to test execution.
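A simplified test case written against a hypothetical apply_discount() function shows how these elements (precondition, input values, expected result defined before execution) can appear in practice; the function and values below are invented purely for illustration.

# Minimal sketch: one test case with explicit precondition, inputs and an
# expected result defined before execution. apply_discount() is hypothetical.
import unittest

def apply_discount(price, is_member):
    # Stand-in unit under test, included only so the example runs.
    return round(price * 0.9, 2) if is_member else price

class DiscountTestCase(unittest.TestCase):
    def test_member_gets_ten_percent_discount(self):
        is_member = True          # execution precondition: registered member
        price = 100.00            # input value
        expected = 90.00          # expected result, defined before execution
        self.assertEqual(apply_discount(price, is_member), expected)

if __name__ == "__main__":
    unittest.main()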

During test implementation the test cases are developed, implemented, prioritized and organized in the test procedure specification (IEEE STD 829-1998). The test procedure specifies the sequence of actions for the execution of a test. If tests are run using a test execution tool, the sequence of actions is specified in a test script (which is an automated test procedure).

The various test procedures and automated test scripts are subsequently formed into a test execution schedule that defines the order in which the various test procedures, and possibly automated test scripts, are executed. The test execution schedule will take into account such factors as regression tests, prioritization, and technical and logical dependencies.
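The sketch below shows one way such a schedule could be derived: hypothetical test procedures are ordered so that logical dependencies run first, with priority used to break ties. The names, priorities and dependencies are invented, and a real schedule would also weigh regression needs and technical constraints.

# Minimal sketch: derive a test execution schedule from dependencies and
# priorities (1 = highest). All data here is illustrative only.
from graphlib import TopologicalSorter

PROCEDURES = {
    "TP-01 create account":   {"priority": 1, "depends_on": []},
    "TP-02 login":            {"priority": 1, "depends_on": ["TP-01 create account"]},
    "TP-03 place order":      {"priority": 2, "depends_on": ["TP-02 login"]},
    "TP-04 regression smoke": {"priority": 1, "depends_on": []},
}

def build_schedule(procedures):
    sorter = TopologicalSorter({name: set(p["depends_on"])
                                for name, p in procedures.items()})
    sorter.prepare()
    schedule = []
    while sorter.is_active():
        # Among procedures whose dependencies are satisfied, run the
        # highest-priority ones first.
        for name in sorted(sorter.get_ready(),
                           key=lambda n: procedures[n]["priority"]):
            schedule.append(name)
            sorter.done(name)
    return schedule

if __name__ == "__main__":
    for step, name in enumerate(build_schedule(PROCEDURES), start=1):
        print(step, name)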





4.2 Categories of Test Design Techniques (K2)                                       15 minutes

Terms
Black-box test design technique, experience-based test design technique, test design technique, white-box test design technique

Background
The purpose of a test design technique is to identify test conditions, test cases, and test data.

It is a classic distinction to denote test techniques as black-box or white-box. Black-box test design techniques (also called specification-based techniques) are a way to derive and select test conditions, test cases, or test data based on an analysis of the test basis documentation. This includes both functional and non-functional testing. Black-box testing, by definition, does not use any information regarding the internal structure of the component or system to be tested. White-box test design techniques (also called structural or structure-based techniques) are based on an analysis of the structure of the component or system. Black-box and white-box testing may also be combined with experience-based techniques to leverage the experience of developers, testers and users to determine what should be tested.

Some techniques fall clearly into a single category; others have elements of more than one category.

This syllabus refers to specification-based test design techniques as black-box techniques and structure-based test design techniques as white-box techniques. In addition, experience-based test design techniques are covered.

Common characteristics of specification-based test design techniques include:
o Models, either formal or informal, are used for the specification of the problem to be solved, the software or its components
o Test cases can be derived systematically from these models

Common characteristics of structure-based test design techniques include:
o Information about how the software is constructed is used to derive the test cases (e.g., code and detailed design information)
o The extent of coverage of the software can be measured for existing test cases, and further test cases can be derived systematically to increase coverage

Common characteristics of experience-based test design techniques include:
o The knowledge and experience of people are used to derive the test cases
o The knowledge of testers, developers, users and other stakeholders about the software, its usage and its environment is one source of information
o Knowledge about likely defects and their distribution is another source of information








4.3 Specification-based or Black-box Techniques (K3)                              150 minutes

Terms
Boundary value analysis, decision table testing, equivalence partitioning, state transition testing, use case testing

4.3.1 Equivalence Partitioning (K3)
In equivalence partitioning, inputs to the software or system are divided into groups that are expected to exhibit similar behavior, so they are likely to be processed in the same way. Equivalence partitions (or classes) can be found for both valid data, i.e., values that should be accepted, and invalid data, i.e., values that should be rejected. Partitions can also be identified for outputs, internal values, time-related values (e.g., before or after an event) and for interface parameters (e.g., integrated components being tested during integration testing). Tests can be designed to cover all valid and invalid partitions. Equivalence partitioning is applicable at all levels of testing.

Equivalence partitioning can be used to achieve input and output coverage goals. It can be applied to human input, input via interfaces to a system, or interface parameters in integration testing.
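
As an illustration only (not part of the syllabus), the following Python sketch assumes a hypothetical requirement: an accepts_age() function whose valid input partition is ages 18 to 65. One representative value is chosen from the valid partition and from each invalid partition.

def accepts_age(age: int) -> bool:
    """Hypothetical system under test: applicants aged 18 to 65 inclusive are accepted."""
    return 18 <= age <= 65

# One representative value per equivalence partition
def test_valid_partition():          # partition 18..65: values that should be accepted
    assert accepts_age(30) is True

def test_invalid_partition_below():  # partition below 18: values that should be rejected
    assert accepts_age(10) is False

def test_invalid_partition_above():  # partition above 65: values that should be rejected
    assert accepts_age(80) is False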

4.3.2 Boundary Value Analysis (K3)
Behavior at the edge of each equivalence partition is more likely to be incorrect than behavior within the partition, so boundaries are an area where testing is likely to yield defects. The maximum and minimum values of a partition are its boundary values. A boundary value for a valid partition is a valid boundary value; the boundary of an invalid partition is an invalid boundary value. Tests can be designed to cover both valid and invalid boundary values. When designing test cases, a test for each boundary value is chosen.

Boundary value analysis can be applied at all test levels. It is relatively easy to apply and its defect-finding capability is high. Detailed specifications are helpful in determining the interesting boundaries.

This technique is often considered as an extension of equivalence partitioning or other black-box test design techniques. It can be used on equivalence classes for user input on screen as well as, for example, on time ranges (e.g., time out, transactional speed requirements) or table ranges (e.g., table size is 256*256).
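
As an illustration only (not part of the syllabus), the sketch below applies boundary value analysis to the same hypothetical accepts_age() rule (valid ages 18 to 65): tests are chosen at the minimum and maximum valid boundary values and at the adjacent invalid boundary values.

import pytest

def accepts_age(age: int) -> bool:   # hypothetical rule, as in the sketch for Section 4.3.1
    return 18 <= age <= 65

@pytest.mark.parametrize("age, expected", [
    (17, False),   # invalid boundary value just below the valid partition
    (18, True),    # minimum valid boundary value
    (65, True),    # maximum valid boundary value
    (66, False),   # invalid boundary value just above the valid partition
])
def test_age_boundaries(age, expected):
    assert accepts_age(age) is expected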

4.3.3 Decision Table Testing (K3)
Decision tables are a good way to capture system requirements that contain logical conditions, and to document internal system design. They may be used to record complex business rules that a system is to implement. When creating decision tables, the specification is analyzed, and the conditions and actions of the system are identified. The input conditions and actions are most often stated in such a way that they must be true or false (Boolean). The decision table contains the triggering conditions, often combinations of true and false for all input conditions, and the resulting actions for each combination of conditions. Each column of the table corresponds to a business rule that defines a unique combination of conditions, which results in the execution of the actions associated with that rule. The coverage standard commonly used with decision table testing is to have at least one test per column in the table, which typically involves covering all combinations of triggering conditions.


The strength of decision table testing is that it creates combinations of conditions that otherwise might not have been exercised during testing. It may be applied to all situations when the action of the software depends on several logical decisions.
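
As an illustration only (not part of the syllabus), the sketch below assumes a hypothetical business rule with two Boolean conditions: a discount is given only to an existing customer whose order is over 100. The decision table has one column (rule) per combination of conditions, and at least one test is designed per column.

import pytest

def give_discount(existing_customer: bool, order_over_100: bool) -> bool:
    # Hypothetical business rule: discount only for existing customers ordering over 100
    return existing_customer and order_over_100

# Decision table: each entry is one column (business rule), i.e., a unique combination
# of the triggering conditions and the resulting action.
@pytest.mark.parametrize("existing_customer, order_over_100, discount_expected", [
    (True,  True,  True),    # rule 1
    (True,  False, False),   # rule 2
    (False, True,  False),   # rule 3
    (False, False, False),   # rule 4
])
def test_discount_decision_table(existing_customer, order_over_100, discount_expected):
    assert give_discount(existing_customer, order_over_100) is discount_expected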

4.3.4 State Transition Testing (K3)
A system may exhibit a different response depending on current conditions or previous history (its state). In this case, that aspect of the system can be shown with a state transition diagram. It allows the tester to view the software in terms of its states, transitions between states, the inputs or events that trigger state changes (transitions) and the actions which may result from those transitions. The states of the system or object under test are separate, identifiable and finite in number.

A state table shows the relationship between the states and inputs, and can highlight possible transitions that are invalid.

Tests can be designed to cover a typical sequence of states, to cover every state, to exercise every transition, to exercise specific sequences of transitions or to test invalid transitions.

State transition testing is much used within the embedded software industry and technical automation in general. However, the technique is also suitable for modeling a business object having specific states or testing screen-dialogue flows (e.g., for Internet applications or business scenarios).
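
As an illustration only (not part of the syllabus), the sketch below assumes a hypothetical order object with the states "created", "paid", "shipped" and "cancelled". The state table is represented as a dictionary; any (state, event) pair not listed is an invalid transition. Tests cover a typical sequence of states and one invalid transition.

import pytest

# Hypothetical state table: (current state, event) -> next state
TRANSITIONS = {
    ("created", "pay"):    "paid",
    ("paid",    "ship"):   "shipped",
    ("created", "cancel"): "cancelled",
}

def next_state(state: str, event: str) -> str:
    if (state, event) not in TRANSITIONS:
        raise ValueError(f"invalid transition: event '{event}' in state '{state}'")
    return TRANSITIONS[(state, event)]

def test_typical_sequence_of_states():      # covers the sequence created -> paid -> shipped
    assert next_state("created", "pay") == "paid"
    assert next_state("paid", "ship") == "shipped"

def test_invalid_transition_is_rejected():  # exercises an invalid transition
    with pytest.raises(ValueError):
        next_state("shipped", "pay")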

4.3.5 Use Case Testing (K2)
Tests can be derived from use cases. A use case describes interactions between actors (users or systems), which produce a result of value to a system user or the customer. Use cases may be described at the abstract level (business use case, technology-free, business process level) or at the system level (system use case on the system functionality level). Each use case has preconditions which need to be met for the use case to work successfully. Each use case terminates with postconditions which are the observable results and final state of the system after the use case has been completed. A use case usually has a mainstream (i.e., most likely) scenario and alternative scenarios.

Use cases describe the “process flows” through a system based on its actual likely use, so the test cases derived from use cases are most useful in uncovering defects in the process flows during real-world use of the system. Use cases are very useful for designing acceptance tests with customer/user participation. They also help uncover integration defects caused by the interaction and interference of different components, which individual component testing would not see. Designing test cases from use cases may be combined with other specification-based test techniques.
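
As an illustration only (not part of the syllabus), the sketch below assumes a hypothetical "log in" use case with the precondition that the account exists and is not locked: one test follows the mainstream scenario and two tests follow alternative scenarios.

def authenticate(username: str, password: str, attempts_left: int = 3) -> str:
    """Hypothetical 'log in' use case; the return value is the observable postcondition."""
    if attempts_left <= 0:
        return "locked"        # postcondition of the 'account locked' alternative scenario
    if username == "alice" and password == "secret":
        return "logged_in"     # postcondition of the mainstream scenario
    return "retry"             # postcondition of the 'wrong password' alternative scenario

def test_mainstream_scenario():
    assert authenticate("alice", "secret") == "logged_in"

def test_alternative_scenario_wrong_password():
    assert authenticate("alice", "wrong") == "retry"

def test_alternative_scenario_account_locked():
    assert authenticate("alice", "secret", attempts_left=0) == "locked"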








4.4 Structure-based or White-box Techniques (K4)                              60 minutes

Terms
Code coverage, decision coverage, statement coverage, structure-based testing

Background
Structure-based or white-box testing is based on an identified structure of the software or the system, as seen in the following examples:
o Component level: the structure of a software component, i.e., statements, decisions, branches or even distinct paths
o Integration level: the structure may be a call tree (a diagram in which modules call other modules)
o System level: the structure may be a menu structure, business process or web page structure

In this section, three code-related structural test design techniques for code coverage, based on statements, branches and decisions, are discussed. For decision testing, a control flow diagram may be used to visualize the alternatives for each decision.

4.4.1 Statement Testing and Coverage (K4)
In component testing, statement coverage is the assessment of the percentage of executable statements that have been exercised by a test case suite. The statement testing technique derives test cases to execute specific statements, normally to increase statement coverage.

Statement coverage is determined by the number of executable statements covered by (designed or executed) test cases divided by the number of all executable statements in the code under test.
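
As a hypothetical worked example (not part of the syllabus), the sketch below counts four executable statements in a small grade() function and shows how the coverage figure follows from the definition above.

def grade(score):
    result = "fail"        # statement 1
    if score >= 50:        # statement 2
        result = "pass"    # statement 3 (only reached when the condition is True)
    return result          # statement 4

# A suite containing only test_pass() executes all four executable statements:
#   statement coverage = 4 / 4 = 100%.
# A suite containing only a test with score = 30 would execute statements 1, 2 and 4:
#   statement coverage = 3 / 4 = 75%.
def test_pass():
    assert grade(70) == "pass"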

4.4.2 Decision Testing and Coverage (K4)
Decision coverage, related to branch testing, is the assessment of the percentage of decision outcomes (e.g., the True and False options of an IF statement) that have been exercised by a test case suite. The decision testing technique derives test cases to execute specific decision outcomes. Branches originate from decision points in the code and show the transfer of control to different locations in the code.

Decision coverage is determined by the number of all decision outcomes covered by (designed or executed) test cases divided by the number of all possible decision outcomes in the code under test.

Decision testing is a form of control flow testing as it follows a specific flow of control through the decision points. Decision coverage is stronger than statement coverage; 100% decision coverage guarantees 100% statement coverage, but not vice versa.
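
Continuing the hypothetical grade() example from Section 4.4.1 (an assumption for illustration, not part of the syllabus), the sketch below shows why 100% statement coverage does not imply 100% decision coverage.

def grade(score):
    result = "fail"
    if score >= 50:        # one decision with two outcomes: True and False
        result = "pass"
    return result

# test_pass() alone (score = 70) exercises only the True outcome of the decision:
#   decision coverage = 1 outcome covered / 2 possible outcomes = 50%,
#   although statement coverage is already 100%.
# Adding test_fail() exercises the False outcome, raising decision coverage to 100%.
def test_pass():
    assert grade(70) == "pass"

def test_fail():
    assert grade(30) == "fail"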

4.4.3 Other Structure-based Techniques (K1)
There are stronger levels of structural coverage beyond decision coverage, for example, condition coverage and multiple condition coverage.

The concept of coverage can also be applied at other test levels. For example, at the integration level the percentage of modules, components or classes that have been exercised by a test case suite could be expressed as module, component or class coverage.

Tool support is useful for the structural testing of code.




4.5 Experience-based Techniques (K2)                              30 minutes

Terms
Exploratory testing, (fault) attack

Background
Experience-based testing is where tests are derived from the tester’s skill and intuition and their experience with similar applications and technologies. When used to augment systematic techniques, these techniques can be useful in identifying special tests not easily captured by formal techniques, especially when applied after more formal approaches. However, this technique may yield widely varying degrees of effectiveness, depending on the testers’ experience.

A commonly used experience-based technique is error guessing. Generally testers anticipate defects based on experience. A structured approach to the error guessing technique is to enumerate a list of possible defects and to design tests that attack these defects. This systematic approach is called fault attack. These defect and failure lists can be built based on experience, available defect and failure data, and from common knowledge about why software fails.
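
As an illustration only (not part of the syllabus), the sketch below assumes a hypothetical parse_quantity() function and a short fault-attack list of likely defects (empty input, non-numeric input, negative values); one test is designed per item on the list.

import pytest

def parse_quantity(text: str) -> int:
    """Hypothetical function under attack: parses a quantity entered as text."""
    value = int(text)
    if value < 0:
        raise ValueError("quantity must not be negative")
    return value

# Fault-attack list: inputs that experience suggests are likely to expose defects
@pytest.mark.parametrize("suspect_input", ["", "abc", "-1"])
def test_fault_attack_rejects_suspect_inputs(suspect_input):
    with pytest.raises(ValueError):
        parse_quantity(suspect_input)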

Exploratory testing is concurrent test design, test execution, test logging and learning, based on a test charter containing test objectives, and carried out within time-boxes. It is an approach that is most useful where there are few or inadequate specifications and severe time pressure, or in order to augment or complement other, more formal testing. It can serve as a check on the test process, to help ensure that the most serious defects are found.








4.6 Choosing Test Techniques (K2)                              15 minutes

Terms
No specific terms.

Background
The choice of which test techniques to use depends on a number of factors, including the type of system, regulatory standards, customer or contractual requirements, level of risk, type of risk, test objective, documentation available, knowledge of the testers, time and budget, development life cycle, use case models and previous experience with types of defects found.

Some techniques are more applicable to certain situations and test levels; others are applicable to all test levels.

When creating test cases, testers generally use a combination of test techniques including process, rule and data-driven techniques to ensure adequate coverage of the object under test.


References
4.1 Craig, 2002, Hetzel, 1988, IEEE STD 829-1998
4.2 Beizer, 1990, Copeland, 2004
4.3.1 Copeland, 2004, Myers, 1979
4.3.2 Copeland, 2004, Myers, 1979
4.3.3 Beizer, 1990, Copeland, 2004
4.3.4 Beizer, 1990, Copeland, 2004
4.3.5 Copeland, 2004
4.4.3 Beizer, 1990, Copeland, 2004
4.5 Kaner, 2002
4.6 Beizer, 1990, Copeland, 2004








5. Test Management (K3)                              170 minutes

Learning Objectives for Test Management
The objectives identify what you will be able to do following the completion of each module.

5.1 Test Organization (K2)
LO-5.1.1  Recognize the importance of independent testing (K1)
LO-5.1.2  Explain the benefits and drawbacks of independent testing within an organization (K2)
LO-5.1.3  Recognize the different team members to be considered for the creation of a test team (K1)
LO-5.1.4  Recall the tasks of a typical test leader and tester (K1)

5.2 Test Planning and Estimation (K3)
LO-5.2.1  Recognize the different levels and objectives of test planning (K1)
LO-5.2.2  Summarize the purpose and content of the test plan, test design specification and test procedure documents according to the ‘Standard for Software Test Documentation’ (IEEE Std 829-1998) (K2)
LO-5.2.3  Differentiate between conceptually different test approaches, such as analytical, model-based, methodical, process/standard compliant, dynamic/heuristic, consultative and regression-averse (K2)
LO-5.2.4  Differentiate between the subject of test planning for a system and scheduling test execution (K2)
LO-5.2.5  Write a test execution schedule for a given set of test cases, considering prioritization, and technical and logical dependencies (K3)
LO-5.2.6  List test preparation and execution activities that should be considered during test planning (K1)
LO-5.2.7  Recall typical factors that influence the effort related to testing (K1)
LO-5.2.8  Differentiate between two conceptually different estimation approaches: the metrics-based approach and the expert-based approach (K2)
LO-5.2.9  Recognize/justify adequate entry and exit criteria for specific test levels and groups of test cases (e.g., for integration testing, acceptance testing or test cases for usability testing) (K2)

5.3 Test Progress Monitoring and Control (K2)
LO-5.3.1  Recall common metrics used for monitoring test preparation and execution (K1)
LO-5.3.2  Explain and compare test metrics for test reporting and test control (e.g., defects found and fixed, and tests passed and failed) related to purpose and use (K2)
LO-5.3.3  Summarize the purpose and content of the test summary report document according to the ‘Standard for Software Test Documentation’ (IEEE Std 829-1998) (K2)

5.4 Configuration Management (K2)
LO-5.4.1  Summarize how configuration management supports testing (K2)

5.5 Risk and Testing (K2)
LO-5.5.1  Describe a risk as a possible problem that would threaten the achievement of one or more stakeholders’ project objectives (K2)
LO-5.5.2  Remember that the level of risk is determined by likelihood (of happening) and impact (harm resulting if it does happen) (K1)
LO-5.5.3  Distinguish between the project and product risks (K2)
LO-5.5.4  Recognize typical product and project risks (K1)
LO-5.5.5  Describe, using examples, how risk analysis and risk management may be used for test planning (K2)




5.6 Incident Management (K3)
LO-5.6.1  Recognize the content of an incident report according to the ‘Standard for Software Test Documentation’ (IEEE Std 829-1998) (K1)
LO-5.6.2  Write an incident report covering the observation of a failure during testing (K3)








5.1 Test Organization (K2)                              30 minutes

Terms
Tester, test leader, test manager

5.1.1 Test Organization and Independence (K2)
The effectiveness of finding defects by testing and reviews can be improved by using independent testers. Options for independence include the following:
o No independent testers; developers test their own code
o Independent testers within the development teams
o Independent test team or group within the organization, reporting to project management or executive management
o Independent testers from the business organization or user community
o Independent test specialists for specific test types, such as usability testers, security testers or certification testers (who certify a software product against standards and regulations)
o Independent testers outsourced or external to the organization

For large, complex or safety critical projects, it is usually best to have multiple levels of testing, with some or all of the levels done by independent testers. Development staff may participate in testing, especially at the lower levels, but their lack of objectivity often limits their effectiveness. The independent testers may have the authority to require and define test processes and rules, but testers should take on such process-related roles only in the presence of a clear management mandate to do so.

The benefits of independence include:
o Independent testers see other and different defects, and are unbiased
o An independent tester can verify assumptions people made during specification and implementation of the system

Drawbacks include:
o Isolation from the development team (if treated as totally independent)
o Developers may lose a sense of responsibility for quality
o Independent testers may be seen as a bottleneck or blamed for delays in release

Testing tasks may be done by people in a specific testing role, or may be done by someone in another role, such as a project manager, quality manager, developer, business and domain expert, infrastructure or IT operations.

5.1.2 Tasks of the Test Leader and Tester (K1)
In this syllabus two test positions are covered, test leader and tester. The activities and tasks performed by people in these two roles depend on the project and product context, the people in the roles, and the organization.

Sometimes the test leader is called a test manager or test coordinator. The role of the test leader may be performed by a project manager, a development manager, a quality assurance manager or the manager of a test group. In larger projects two positions may exist: test leader and test manager. Typically the test leader plans, monitors and controls the testing activities and tasks as defined in Section 1.4.

Typical test leader tasks may include:
o Coordinate the test strategy and plan with project managers and others
o Write or review a test strategy for the project, and test policy for the organization
o Contribute the testing perspective to other project activities, such as integration planning
o Plan the tests – considering the context and understanding the test objectives and risks – including selecting test approaches, estimating the time, effort and cost of testing, acquiring resources, defining test levels, cycles, and planning incident management
o Initiate the specification, preparation, implementation and execution of tests, monitor the test results and check the exit criteria
o Adapt planning based on test results and progress (sometimes documented in status reports) and take any action necessary to compensate for problems
o Set up adequate configuration management of testware for traceability
o Introduce suitable metrics for measuring test progress and evaluating the quality of the testing and the product
o Decide what should be automated, to what degree, and how
o Select tools to support testing and organize any training in tool use for testers
o Decide about the implementation of the test environment
o Write test summary reports based on the information gathered during testing

Typical tester tasks may include:
o Review and contribute to test plans
o Analyze, review and assess user requirements, specifications and models for testability
o Create test specifications
o Set up the test environment (often coordinating with system administration and network management)
o Prepare and acquire test data
o Implement tests on all test levels, execute and log the tests, evaluate the results and document the deviations from expected results
o Use test administration or management tools and test monitoring tools as required
o Automate tests (may be supported by a developer or a test automation expert)
o Measure performance of components and systems (if applicable)
o Review tests developed by others

People who work on test analysis, test design, specific test types or test automation may be specialists in these roles. Depending on the test level and the risks related to the product and the project, different people may take over the role of tester, keeping some degree of independence. Typically testers at the component and integration level would be developers, testers at the acceptance test level would be business experts and users, and testers for operational acceptance testing would be operators.








5.2 Test Planning and Estimation (K3)                              40 minutes

Terms
Test approach, test strategy

5.2.1 Test Planning (K2)
This section covers the purpose of test planning within development and implementation projects, and for maintenance activities. Planning may be documented in a master test plan and in separate test plans for test levels such as system testing and acceptance testing. The outline of a test-planning document is covered by the ‘Standard for Software Test Documentation’ (IEEE Std 829-1998).

Planning is influenced by the test policy of the organization, the scope of testing, objectives, risks, constraints, criticality, testability and the availability of resources. As the project and test planning progress, more information becomes available and more detail can be included in the plan.

Test planning is a continuous activity and is performed in all life cycle processes and activities. Feedback from test activities is used to recognize changing risks so that planning can be adjusted.

5.2.2 Test Planning Activities (K3)
Test planning activities for an entire system or part of a system may include:
o Determining the scope and risks and identifying the objectives of testing
o Defining the overall approach of testing, including the definition of the test levels and entry and exit criteria
o Integrating and coordinating the testing activities into the software life cycle activities (acquisition, supply, development, operation and maintenance)
o Making decisions about what to test, what roles will perform the test activities, how the test activities should be done, and how the test results will be evaluated
o Scheduling test analysis and design activities
o Scheduling test implementation, execution and evaluation
o Assigning resources for the different activities defined
o Defining the amount, level of detail, structure and templates for the test documentation
o Selecting metrics for monitoring and controlling test preparation and execution, defect resolution and risk issues
o Setting the level of detail for test procedures in order to provide enough information to support reproducible test preparation and execution

5.2.3 Entry Criteria (K2)
Entry criteria define when to start testing, such as at the beginning of a test level or when a set of tests is ready for execution.

Typically, entry criteria may cover the following:
o Test environment availability and readiness
o Test tool readiness in the test environment
o Testable code availability
o Test data availability

5.2.4 Exit Criteria (K2)
Exit criteria define when to stop testing, such as at the end of a test level or when a set of tests has achieved a specific goal.



Typically, exit criteria may cover the following:
o Thoroughness measures, such as coverage of code, functionality or risk
o Estimates of defect density or reliability measures
o Cost
o Residual risks, such as defects not fixed or lack of test coverage in certain areas
o Schedules, such as those based on time to market
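
The measures above can be checked mechanically once they are expressed as thresholds. The following is a minimal, hypothetical sketch (not part of the syllabus) of how collected figures might be evaluated against two such criteria; the metric names and threshold values are illustrative assumptions only.

    # Illustrative sketch: evaluate hypothetical exit criteria against collected figures.
    # Metric names and thresholds are assumptions, not values prescribed by the syllabus.

    def exit_criteria_met(metrics, min_code_coverage=0.80, max_open_critical_defects=0):
        """Return (met, reasons) for a simple pair of exit criteria."""
        reasons = []
        if metrics["code_coverage"] < min_code_coverage:
            reasons.append(f"code coverage {metrics['code_coverage']:.0%} is below {min_code_coverage:.0%}")
        if metrics["open_critical_defects"] > max_open_critical_defects:
            reasons.append(f"{metrics['open_critical_defects']} critical defects are still open")
        return len(reasons) == 0, reasons

    if __name__ == "__main__":
        collected = {"code_coverage": 0.85, "open_critical_defects": 2}
        met, reasons = exit_criteria_met(collected)
        print("Exit criteria met" if met else "Exit criteria not met: " + "; ".join(reasons))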

5.2.5 Test Estimation (K2)
Two approaches for the estimation of test effort are:
o The metrics-based approach: estimating the testing effort based on metrics of former or similar projects or based on typical values
o The expert-based approach: estimating the tasks based on estimates made by the owner of the tasks or by experts

Once the test effort is estimated, resources can be identified and a schedule can be drawn up.
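
As an illustration of the metrics-based approach, the short sketch below scales the effort recorded for a similar former project by the relative size of the new one. The figures and the size measure (number of requirements) are invented for the example and are not values recommended by the syllabus.

    # Hypothetical metrics-based estimate: scale a former project's test effort
    # by the relative size of the new project. All figures are invented.

    def estimate_test_effort(former_effort_hours, former_size, new_size):
        """Scale effort linearly with a chosen size measure (e.g., number of requirements)."""
        hours_per_unit = former_effort_hours / former_size
        return hours_per_unit * new_size

    if __name__ == "__main__":
        # Former project: 600 hours of test effort for 300 requirements; new project has 420.
        estimate = estimate_test_effort(former_effort_hours=600, former_size=300, new_size=420)
        print(f"Estimated test effort: {estimate:.0f} hours")   # 840 hours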

The testing effort may depend on a number of factors, including:
o Characteristics of the product: the quality of the specification and other information used for test models (i.e., the test basis), the size of the product, the complexity of the problem domain, the requirements for reliability and security, and the requirements for documentation
o Characteristics of the development process: the stability of the organization, tools used, test process, skills of the people involved, and time pressure
o The outcome of testing: the number of defects and the amount of rework required

5.2.6 Test Strategy, Test Approach (K2)
The test approach is the implementation of the test strategy for a specific project. The test approach is defined and refined in the test plans and test designs. It typically includes the decisions made based on the (test) project’s goal and risk assessment. It is the starting point for planning the test process, for selecting the test design techniques and test types to be applied, and for defining the entry and exit criteria.

The selected approach depends on the context and may consider risks, hazards and safety, available resources and skills, the technology, the nature of the system (e.g., custom built vs. COTS), test objectives, and regulations.

Typical approaches include:
o Analytical approaches, such as risk-based testing where testing is directed to areas of greatest risk
o Model-based approaches, such as stochastic testing using statistical information about failure rates (such as reliability growth models) or usage (such as operational profiles)
o Methodical approaches, such as failure-based (including error guessing and fault attacks), experience-based, checklist-based, and quality characteristic-based
o Process- or standard-compliant approaches, such as those specified by industry-specific standards or the various agile methodologies
o Dynamic and heuristic approaches, such as exploratory testing, where testing is more reactive to events than pre-planned, and where execution and evaluation are concurrent tasks
o Consultative approaches, such as those in which test coverage is driven primarily by the advice and guidance of technology and/or business domain experts outside the test team
o Regression-averse approaches, such as those that include reuse of existing test material, extensive automation of functional regression tests, and standard test suites

Different approaches may be combined, for example, a risk-based dynamic approach.







5.3 Test Progress Monitoring and Control (K2)                                      20 minutes

Terms
Defect density, failure rate, test control, test monitoring, test summary report

5.3.1 Test Progress Monitoring (K1)
The purpose of test monitoring is to provide feedback and visibility about test activities. Information to be monitored may be collected manually or automatically and may be used to measure exit criteria, such as coverage. Metrics may also be used to assess progress against the planned schedule and budget. Common test metrics include:
o Percentage of work done in test case preparation (or percentage of planned test cases prepared)
o Percentage of work done in test environment preparation
o Test case execution (e.g., number of test cases run/not run, and test cases passed/failed)
o Defect information (e.g., defect density, defects found and fixed, failure rate, and re-test results)
o Test coverage of requirements, risks or code
o Subjective confidence of testers in the product
o Dates of test milestones
o Testing costs, including the cost compared to the benefit of finding the next defect or to run the next test
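
Several of the metrics listed above can be derived directly from raw test-run and defect counts. The sketch below is a hypothetical example of such a calculation; the input figures and the use of KLOC as the size unit for defect density are assumptions made for the illustration.

    # Illustrative calculation of common test progress metrics from raw counts.
    # Input figures and the KLOC size unit are assumptions for the example.

    def progress_metrics(planned, prepared, run, passed, defects_found, size_kloc):
        return {
            "preparation_done": prepared / planned,        # share of planned test cases prepared
            "execution_done": run / planned,               # share of planned test cases executed
            "pass_rate": passed / run if run else 0.0,     # passed among executed test cases
            "defect_density": defects_found / size_kloc,   # defects per 1000 lines of code
        }

    if __name__ == "__main__":
        metrics = progress_metrics(planned=200, prepared=180, run=150, passed=135,
                                   defects_found=42, size_kloc=60)
        for name, value in metrics.items():
            print(f"{name}: {value:.2f}")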

5.3.2 Test Reporting (K2)
Test reporting is concerned with summarizing information about the testing endeavor, including:
o What happened during a period of testing, such as dates when exit criteria were met
o Analyzed information and metrics to support recommendations and decisions about future actions, such as an assessment of defects remaining, the economic benefit of continued testing, outstanding risks, and the level of confidence in the tested software

The outline of a test summary report is given in the ‘Standard for Software Test Documentation’ (IEEE Std 829-1998).

Metrics should be collected during and at the end of a test level in order to assess:
o The adequacy of the test objectives for that test level
o The adequacy of the test approaches taken
o The effectiveness of the testing with respect to the objectives

5.3.3 Test Control (K2)
Test control describes any guiding or corrective actions taken as a result of information and metrics gathered and reported. Actions may cover any test activity and may affect any other software life cycle activity or task.

Examples of test control actions include:
o Making decisions based on information from test monitoring
o Re-prioritizing tests when an identified risk occurs (e.g., software delivered late)
o Changing the test schedule due to availability or unavailability of a test environment
o Setting an entry criterion requiring fixes to have been re-tested (confirmation tested) by a developer before accepting them into a build








5.4 Configuration Management (K2)                                                  10 minutes

Terms
Configuration management, version control

Background
The purpose of configuration management is to establish and maintain the integrity of the products (components, data and documentation) of the software or system through the project and product life cycle.

For testing, configuration management may involve ensuring the following:
o All items of testware are identified, version controlled, tracked for changes, related to each other and related to development items (test objects) so that traceability can be maintained throughout the test process
o All identified documents and software items are referenced unambiguously in test documentation

For the tester, configuration management helps to uniquely identify (and to reproduce) the tested item, test documents, the tests and the test harness(es).

During test planning, the configuration management procedures and infrastructure (tools) should be chosen, documented and implemented.
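
To make the idea of unambiguous referencing concrete, the sketch below shows one hypothetical way a test run could record the exact versions of the test object and testware it used, so that the run can be reproduced later. The field names and identifiers are illustrative assumptions, not a prescribed format.

    # Hypothetical record of the configuration items used in a test run, so that the
    # tested item, testware and environment can be identified and reproduced later.

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class TestRunConfiguration:
        test_object_version: str    # build or release identifier of the item under test
        testware_revision: str      # version of the test scripts and test documents
        test_data_revision: str     # version of the test data set
        environment: str            # description of the test environment

    run_config = TestRunConfiguration(
        test_object_version="release-2.3.1-build-457",
        testware_revision="testware-rev-112",
        test_data_revision="dataset-rev-9",
        environment="OS version X, browser Y",
    )
    print(run_config)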








5.5 Risk and Testing (K2)                                                          30 minutes

Terms
Product risk, project risk, risk, risk-based testing

Background
Risk can be defined as the chance of an event, hazard, threat or situation occurring and resulting in undesirable consequences or a potential problem. The level of risk will be determined by the likelihood of an adverse event happening and the impact (the harm resulting from that event).
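
Because the level of risk is determined by likelihood and impact, it is often expressed in practice as the product of the two on agreed rating scales. The sketch below is an illustrative calculation only; the 1-to-5 scales are a common convention and are not mandated by the syllabus.

    # Illustrative risk level: likelihood x impact on assumed 1-to-5 rating scales.

    def risk_level(likelihood, impact):
        """Both arguments are ratings from 1 (low) to 5 (high); a higher product means higher risk."""
        if not (1 <= likelihood <= 5 and 1 <= impact <= 5):
            raise ValueError("ratings must be between 1 and 5")
        return likelihood * impact

    print(risk_level(likelihood=4, impact=5))   # 20 -> a high-risk item
    print(risk_level(likelihood=2, impact=2))   # 4  -> a low-risk item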

5.5.1 Project Risks (K2)
Project risks are the risks that surround the project’s capability to deliver its objectives, such as:
o Organizational factors:
   • Skill, training and staff shortages
   • Personnel issues
   • Political issues, such as:
        Problems with testers communicating their needs and test results
        Failure by the team to follow up on information found in testing and reviews (e.g., not improving development and testing practices)
   • Improper attitude toward or expectations of testing (e.g., not appreciating the value of finding defects during testing)
o Technical issues:
   • Problems in defining the right requirements
   • The extent to which requirements cannot be met given existing constraints
   • Test environment not ready on time
   • Late data conversion, migration planning and development and testing data conversion/migration tools
   • Low quality of the design, code, configuration data, test data and tests
o Supplier issues:
   • Failure of a third party
   • Contractual issues

When analyzing, managing and mitigating these risks, the test manager is following well-established project management principles. The ‘Standard for Software Test Documentation’ (IEEE Std 829-1998) outline for test plans requires risks and contingencies to be stated.

5.5.2 Product Risks (K2)
Potential failure areas (adverse future events or hazards) in the software or system are known as product risks, as they are a risk to the quality of the product. These include:
o Failure-prone software delivered
o The potential that the software/hardware could cause harm to an individual or company
o Poor software characteristics (e.g., functionality, reliability, usability and performance)
o Poor data integrity and quality (e.g., data migration issues, data conversion problems, data transport problems, violation of data standards)
o Software that does not perform its intended functions

Risks are used to decide where to start testing and where to test more; testing is used to reduce the risk of an adverse effect occurring, or to reduce the impact of an adverse effect.






Product risks are a special type of risk to the success of a project. Testing as a risk-control activity provides feedback about the residual risk by measuring the effectiveness of critical defect removal and of contingency plans.

A risk-based approach to testing provides proactive opportunities to reduce the levels of product risk, starting in the initial stages of a project. It involves the identification of product risks and their use in guiding test planning and control, specification, preparation and execution of tests. In a risk-based approach the risks identified may be used to:
o Determine the test techniques to be employed
o Determine the extent of testing to be carried out
o Prioritize testing in an attempt to find the critical defects as early as possible
o Determine whether any non-testing activities could be employed to reduce risk (e.g., providing training to inexperienced designers)

Risk-based testing draws on the collective knowledge and insight of the project stakeholders to determine the risks and the levels of testing required to address those risks.

To ensure that the chance of a product failure is minimized, risk management activities provide a disciplined approach to:
o Assess (and reassess on a regular basis) what can go wrong (risks)
o Determine what risks are important to deal with
o Implement actions to deal with those risks

In addition, testing may support the identification of new risks, may help to determine what risks should be reduced, and may lower uncertainty about risks.
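
The prioritization described in this section can be pictured as ordering test cases by a risk score, so that the riskiest areas are addressed first. The sketch below uses the likelihood-times-impact convention from the Background above; the test names and ratings are invented for the example.

    # Illustrative risk-based prioritization: address the riskiest areas first.
    # Test names and ratings are invented for the example.

    tests = [
        {"name": "payment_authorization", "likelihood": 4, "impact": 5},
        {"name": "report_layout",         "likelihood": 2, "impact": 2},
        {"name": "user_login",            "likelihood": 3, "impact": 5},
    ]

    # A higher likelihood x impact product means a higher priority.
    prioritized = sorted(tests, key=lambda t: t["likelihood"] * t["impact"], reverse=True)

    for t in prioritized:
        print(f"{t['name']}: risk score {t['likelihood'] * t['impact']}")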








5.6 Incident Management (K3)                                                       40 minutes

Terms
Incident logging, incident management, incident report

Background
Since one of the objectives of testing is to find defects, the discrepancies between actual and expected outcomes need to be logged as incidents. An incident must be investigated and may turn out to be a defect. Appropriate actions to dispose of incidents and defects should be defined. Incidents and defects should be tracked from discovery and classification to correction and confirmation of the solution. In order to manage all incidents to completion, an organization should establish an incident management process and rules for classification.

Incidents may be raised during development, review, testing or use of a software product. They may be raised for issues in code or the working system, or in any type of documentation including requirements, development documents, test documents, and user information such as “Help” or installation guides.

Incident reports have the following objectives:
o Provide developers and other parties with feedback about the problem to enable identification, isolation and correction as necessary
o Provide test leaders a means of tracking the quality of the system under test and the progress of the testing
o Provide ideas for test process improvement

Details of the incident report may include:
o Date of issue, issuing organization, and author
o Expected and actual results
o Identification of the test item (configuration item) and environment
o Software or system life cycle process in which the incident was observed
o Description of the incident to enable reproduction and resolution, including logs, database dumps or screenshots
o Scope or degree of impact on stakeholder(s) interests
o Severity of the impact on the system
o Urgency/priority to fix
o Status of the incident (e.g., open, deferred, duplicate, waiting to be fixed, fixed awaiting re-test, closed)
o Conclusions, recommendations and approvals
o Global issues, such as other areas that may be affected by a change resulting from the incident
o Change history, such as the sequence of actions taken by project team members with respect to the incident to isolate, repair, and confirm it as fixed
o References, including the identity of the test case specification that revealed the problem

The structure of an incident report is also covered in the ‘Standard for Software Test Documentation’ (IEEE Std 829-1998).
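
The possible contents listed above map naturally onto a structured record. The sketch below shows a hypothetical, much-reduced incident report structure for illustration; the field names and status values are assumptions and do not reproduce the IEEE Std 829-1998 layout.

    # Hypothetical, reduced incident report record for illustration only
    # (not the IEEE Std 829-1998 layout).

    from dataclasses import dataclass, field
    from datetime import date

    @dataclass
    class IncidentReport:
        issue_date: date
        author: str
        test_item: str               # configuration item and environment
        expected_result: str
        actual_result: str
        severity: str                # impact on the system
        priority: str                # urgency to fix
        status: str = "open"         # e.g., open, deferred, fixed awaiting re-test, closed
        references: list = field(default_factory=list)   # e.g., test case specification ids

    report = IncidentReport(
        issue_date=date(2011, 3, 31),
        author="tester A",
        test_item="billing module build 457, test environment 2",
        expected_result="invoice total 100.00",
        actual_result="invoice total 0.00",
        severity="major",
        priority="high",
        references=["TC-BILL-017"],
    )
    print(report.status, "-", report.test_item)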






References
5.1.1 Black, 2001, Hetzel, 1988
5.1.2 Black, 2001, Hetzel, 1988
5.2.5 Black, 2001, Craig, 2002, IEEE Std 829-1998, Kaner, 2002
5.3.3 Black, 2001, Craig, 2002, Hetzel, 1988, IEEE Std 829-1998
5.4 Craig, 2002
5.5.2 Black, 2001, IEEE Std 829-1998
5.6 Black, 2001, IEEE Std 829-1998








6. Tool Support for Testing (K2)                                                   80 minutes

Learning Objectives for Tool Support for Testing
The objectives identify what you will be able to do following the completion of each module.

6.1 Types of Test Tools (K2)
LO-6.1.1   Classify different types of test tools according to their purpose and to the activities of the fundamental test process and the software life cycle (K2)
LO-6.1.3   Explain the term test tool and the purpose of tool support for testing (K2)²

6.2 Effective Use of Tools: Potential Benefits and Risks (K2)
LO-6.2.1   Summarize the potential benefits and risks of test automation and tool support for testing (K2)
LO-6.2.2   Remember special considerations for test execution tools, static analysis, and test management tools (K1)

6.3 Introducing a Tool into an Organization (K1)
LO-6.3.1   State the main principles of introducing a tool into an organization (K1)
LO-6.3.2   State the goals of a proof-of-concept for tool evaluation and a piloting phase for tool implementation (K1)
LO-6.3.3   Recognize that factors other than simply acquiring a tool are required for good tool support (K1)




² LO-6.1.2 Intentionally skipped




6.1 Types of Test Tools (K2)                                                       45 minutes

Terms
Configuration management tool, coverage tool, debugging tool, dynamic analysis tool, incident management tool, load testing tool, modeling tool, monitoring tool, performance testing tool, probe effect, requirements management tool, review tool, security tool, static analysis tool, stress testing tool, test comparator, test data preparation tool, test design tool, test harness, test execution tool, test management tool, unit test framework tool

6.1.1 Tool Support for Testing (K2)
Test tools can be used for one or more activities that support testing. These include:
1. Tools that are directly used in testing, such as test execution tools, test data generation tools and result comparison tools
2. Tools that help in managing the testing process, such as those used to manage tests, test results, data, requirements, incidents, defects, etc., and for reporting and monitoring test execution
3. Tools that are used in reconnaissance, or, in simple terms: exploration (e.g., tools that monitor file activity for an application)
4. Any tool that aids in testing (a spreadsheet is also a test tool in this meaning)

Tool support for testing can have one or more of the following purposes depending on the context:
o Improve the efficiency of test activities by automating repetitive tasks or supporting manual test activities like test planning, test design, test reporting and monitoring
o Automate activities that require significant resources when done manually (e.g., static testing)
o Automate activities that cannot be executed manually (e.g., large scale performance testing of client-server applications)
o Increase reliability of testing (e.g., by automating large data comparisons or simulating behavior)

The term “test frameworks” is also frequently used in the industry, in at least three meanings:
o Reusable and extensible testing libraries that can be used to build testing tools (called test harnesses as well)
o A type of design of test automation (e.g., data-driven, keyword-driven)
o Overall process of execution of testing

For the purpose of this syllabus, the term “test frameworks” is used in its first two meanings, as described in Section 6.1.6.
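
As a small illustration of the second meaning (a type of design of test automation), the sketch below shows a data-driven arrangement in which one test procedure is executed against a table of inputs and expected outcomes. The function under test and the data rows are invented for the example.

    # Illustrative data-driven design: one test procedure, many data rows.
    # The function under test and the data rows are invented for the example.

    def add(a, b):              # stand-in for the behavior under test
        return a + b

    test_data = [               # each row: two inputs and the expected outcome
        (1, 2, 3),
        (0, 0, 0),
        (-1, 1, 0),
    ]

    for a, b, expected in test_data:
        actual = add(a, b)
        verdict = "PASS" if actual == expected else "FAIL"
        print(f"add({a}, {b}) = {actual}, expected {expected}: {verdict}")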

6.1.2 Test Tool Classification (K2)
There are a number of tools that support different aspects of testing. Tools can be classified based on several criteria such as purpose, commercial / free / open-source / shareware, technology used and so forth. Tools are classified in this syllabus according to the testing activities that they support.

Some tools clearly support one activity; others may support more than one activity, but are classified under the activity with which they are most closely associated. Tools from a single provider, especially those that have been designed to work together, may be bundled into one package.

Some types of test tools can be intrusive, which means that they can affect the actual outcome of the test. For example, the actual timing may be different due to the extra instructions that are executed by the tool, or you may get a different measure of code coverage. The consequence of intrusive tools is called the probe effect.



Some tools offer support more appropriate for developers (e.g., tools that are used during component and component integration testing). Such tools are marked with “(D)” in the list below.

6.1.3 Tool Support for Management of Testing and Tests (K1)
Management tools apply to all test activities over the entire software life cycle.

Test Management Tools
These tools provide interfaces for executing tests, tracking defects and managing requirements, along with support for quantitative analysis and reporting of the test objects. They also support tracing the test objects to requirement specifications and might have an independent version control capability or an interface to an external one.

Requirements Management Tools
These tools store requirement statements, store the attributes for the requirements (including priority), provide unique identifiers and support tracing the requirements to individual tests. These tools may also help with identifying inconsistent or missing requirements.
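
Tracing requirements to individual tests can be pictured as a simple mapping, in which requirements without any linked test stand out immediately as coverage gaps. The sketch below is an illustrative example only; the identifiers are invented.

    # Illustrative requirements-to-tests traceability; all identifiers are invented.

    traceability = {
        "REQ-001": ["TC-101", "TC-102"],
        "REQ-002": ["TC-103"],
        "REQ-003": [],                 # no test yet -> a coverage gap
    }

    uncovered = [req for req, tests in traceability.items() if not tests]
    print("Requirements without tests:", uncovered)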

Incident Management Tools (Defect Tracking Tools)
These tools store and manage incident reports, i.e., defects, failures, change requests or perceived problems and anomalies, and help in managing the life cycle of incidents, optionally with support for statistical analysis.

Configuration Management Tools
Although not strictly test tools, these are necessary for storage and version management of testware and related software, especially when configuring more than one hardware/software environment in terms of operating system versions, compilers, browsers, etc.

6.1.4 Tool Support for Static Testing (K1)
Static testing tools provide a cost effective way of finding more defects at an earlier stage in the development process.

Review Tools
These tools assist with review processes, checklists and review guidelines, and are used to store and communicate review comments and report on defects and effort. They can be of further help by providing aid for online reviews for large or geographically dispersed teams.

Static Analysis Tools (D)
These tools help developers and testers find defects prior to dynamic testing by providing support for enforcing coding standards (including secure coding) and analysis of structures and dependencies. They can also help in planning or risk analysis by providing metrics for the code (e.g., complexity).
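
As a deliberately simplified illustration of a rule-based static check (real static analysis tools parse the code and apply many rules), the sketch below flags over-long source lines without executing the code; the length limit is an assumed coding-standard value.

    # Deliberately simplified illustration of a static check: flag over-long lines
    # without executing the code. Real static analysis tools parse the code and
    # check many more rules; this sketch is not such a tool.

    MAX_LINE_LENGTH = 80   # assumed coding-standard limit for the example

    source_lines = [
        "def total(prices):",
        "    return sum(prices)  # " + "x" * 90,   # an over-long comment line
    ]

    for number, line in enumerate(source_lines, start=1):
        if len(line) > MAX_LINE_LENGTH:
            print(f"line {number}: exceeds {MAX_LINE_LENGTH} characters ({len(line)})")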

Modeling Tools (D)
These tools are used to validate software models (e.g., a physical data model (PDM) for a relational database) by enumerating inconsistencies and finding defects. These tools can often aid in generating some test cases based on the model.


6.1.5 Tool Support for Test Specification (K1)
Test Design Tools
These tools are used to generate test inputs or executable tests and/or test oracles from requirements, graphical user interfaces, design models (state, data or object) or code.




Test Data Preparation Tools
Test data preparation tools manipulate databases, files or data transmissions to set up test data to be used during the execution of tests, for example to ensure security through data anonymity.
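
As a hedged illustration of the anonymity idea (the record fields and the choice of hashing are invented, not prescribed by the syllabus), a preparation step might copy production records while masking personal fields:

    import hashlib

    def anonymize(record):
        """Return a copy of the record with personal fields replaced by stable hashes."""
        masked = dict(record)
        for field in ("name", "email"):  # assumed personal fields
            masked[field] = hashlib.sha256(record[field].encode("utf-8")).hexdigest()[:12]
        return masked

    production_row = {"id": 42, "name": "Alice Example", "email": "alice@example.com", "balance": 100}
    print(anonymize(production_row))  # id and balance kept, personal data masked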

6.1.6 Tool Support for Test Execution and Logging (K1)
Test Execution Tools
These tools enable tests to be executed automatically, or semi-automatically, using stored inputs and expected outcomes, through the use of a scripting language, and usually provide a test log for each test run. They can also be used to record tests, and usually support scripting languages or GUI-based configuration for parameterization of data and other customization in the tests.

Test Harness/Unit Test Framework Tools (D)
A unit test harness or framework facilitates the testing of components or parts of a system by simulating the environment in which that test object will run, through the provision of mock objects as stubs or drivers.
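
The syllabus does not prescribe any particular framework; as a minimal sketch of the idea in Python's built-in unittest framework, a mock object stands in for an external dependency of the component under test (the component and its payment gateway are invented for illustration):

    import unittest
    from unittest.mock import Mock

    def charge_customer(gateway, amount):
        """Component under test: charges a customer via an external payment gateway."""
        response = gateway.charge(amount)
        return response["status"] == "ok"

    class ChargeCustomerTest(unittest.TestCase):
        def test_successful_charge(self):
            # The real gateway is replaced by a mock acting as a stub.
            gateway = Mock()
            gateway.charge.return_value = {"status": "ok"}
            self.assertTrue(charge_customer(gateway, 10))
            gateway.charge.assert_called_once_with(10)

    if __name__ == "__main__":
        unittest.main()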

Test Comparators
Test comparators determine differences between files, databases or test results. Test execution tools typically include dynamic comparators, but post-execution comparison may be done by a separate comparison tool. A test comparator may use a test oracle, especially if it is automated.
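
A bare-bones sketch of a post-execution comparator (file names are placeholders, not taken from the syllabus) might diff an actual output file against an expected "golden" file:

    import difflib

    def compare_files(actual_path, expected_path):
        """Return a unified diff between actual and expected output (empty if they match)."""
        with open(actual_path) as actual, open(expected_path) as expected:
            diff = difflib.unified_diff(
                expected.readlines(), actual.readlines(),
                fromfile=expected_path, tofile=actual_path)
        return list(diff)

    # Example usage: an empty diff means the test output matches the oracle.
    # mismatches = compare_files("run_output.txt", "expected_output.txt")
    # print("PASS" if not mismatches else "".join(mismatches))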

Coverage Measurement Tools (D)
These tools, through intrusive or non-intrusive means, measure the percentage of specific types of code structures that have been exercised (e.g., statements, branches or decisions, and module or function calls) by a set of tests.
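
As an illustrative sketch only (real coverage tools are far more capable), the intrusive approach can be imitated with Python's tracing hook, which records which statement lines a test actually executes:

    import sys

    executed_lines = set()

    def tracer(frame, event, arg):
        # Record every (file, line) pair executed while tracing is active.
        if event == "line":
            executed_lines.add((frame.f_code.co_filename, frame.f_lineno))
        return tracer

    def absolute(x):          # trivial code under test
        if x < 0:
            return -x
        return x

    sys.settrace(tracer)
    absolute(5)               # exercises only the positive branch
    sys.settrace(None)
    print(f"{len(executed_lines)} lines executed by this test")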

Security Testing Tools
These tools are used to evaluate the security characteristics of software. This includes evaluating the ability of the software to protect data confidentiality, integrity, authentication, authorization, availability, and non-repudiation. Security tools are mostly focused on a particular technology, platform, and purpose.

6.1.7 Tool Support for Performance and Monitoring (K1)
Dynamic Analysis Tools (D)
Dynamic analysis tools find defects that are evident only when software is executing, such as time dependencies or memory leaks. They are typically used in component and component integration testing, and when testing middleware.

Performance Testing/Load Testing/Stress Testing Tools
Performance testing tools monitor and report on how a system behaves under a variety of simulated usage conditions in terms of the number of concurrent users, their ramp-up pattern, frequency and relative percentage of transactions. The simulation of load is achieved by means of creating virtual users carrying out a selected set of transactions, spread across various test machines commonly known as load generators.
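
A minimal sketch of the virtual-user idea, using Python threads on a single machine (the transaction and all numbers are invented; a real load generator distributes virtual users across machines and models ramp-up and transaction mix):

    import threading, time, random

    response_times = []
    lock = threading.Lock()

    def transaction():
        """Placeholder for a real transaction against the system under test."""
        time.sleep(random.uniform(0.01, 0.05))

    def virtual_user(iterations=10):
        for _ in range(iterations):
            start = time.perf_counter()
            transaction()
            with lock:
                response_times.append(time.perf_counter() - start)

    # Start 20 virtual users and wait for them to finish.
    users = [threading.Thread(target=virtual_user) for _ in range(20)]
    for u in users:
        u.start()
    for u in users:
        u.join()
    print(f"{len(response_times)} transactions, avg {sum(response_times)/len(response_times):.3f}s")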

Monitoring Tools
Monitoring tools continuously analyze, verify and report on usage of specific system resources, and give warnings of possible service problems.

6.1.8 Tool Support for Specific Testing Needs (K1)
Data Quality Assessment
Data is at the center of some projects, such as data conversion/migration projects and applications like data warehouses, and its attributes can vary in terms of criticality and volume. In such contexts, tools need to be employed for data quality assessment to review and verify the data conversion and migration rules, to ensure that the processed data is correct, complete and complies with a predefined, context-specific standard.
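
As a hedged illustration of such a check (the field names and rules below are invented, not part of the syllabus), a data quality assessment step might verify that every migrated record is complete and complies with a simple context-specific rule:

    def check_migrated_rows(rows):
        """Report rows that violate simple completeness and format rules."""
        problems = []
        for i, row in enumerate(rows):
            if not row.get("customer_id"):
                problems.append((i, "missing customer_id"))
            if row.get("country_code") and len(row["country_code"]) != 2:
                problems.append((i, "country_code must be 2 letters"))
        return problems

    migrated = [
        {"customer_id": "C-1", "country_code": "DE"},
        {"customer_id": "",    "country_code": "DEU"},  # violates both rules
    ]
    print(check_migrated_rows(migrated))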

Other testing tools exist for usability testing.








6.2 Effective Use of Tools: Potential Benefits and Risks (K2)    20 minutes
Terms
Data-driven testing, keyword-driven testing, scripting language

6.2.1 Potential Benefits and Risks of Tool Support for Testing (for all tools) (K2)
Simply purchasing or leasing a tool does not guarantee success with that tool. Each type of tool may require additional effort to achieve real and lasting benefits. There are potential benefits and opportunities with the use of tools in testing, but there are also risks.

Potential benefits of using tools include:
o Repetitive work is reduced (e.g., running regression tests, re-entering the same test data, and checking against coding standards)
o Greater consistency and repeatability (e.g., tests executed by a tool in the same order with the same frequency, and tests derived from requirements)
o Objective assessment (e.g., static measures, coverage)
o Ease of access to information about tests or testing (e.g., statistics and graphs about test progress, incident rates and performance)

Risks of using tools include:
o Unrealistic expectations for the tool (including functionality and ease of use)
o Underestimating the time, cost and effort for the initial introduction of a tool (including training and external expertise)
o Underestimating the time and effort needed to achieve significant and continuing benefits from the tool (including the need for changes in the testing process and continuous improvement of the way the tool is used)
o Underestimating the effort required to maintain the test assets generated by the tool
o Over-reliance on the tool (replacement for test design, or use of automated testing where manual testing would be better)
o Neglecting version control of test assets within the tool
o Neglecting relationships and interoperability issues between critical tools, such as requirements management tools, version control tools, incident management tools, defect tracking tools and tools from multiple vendors
o Risk of the tool vendor going out of business, retiring the tool, or selling the tool to a different vendor
o Poor response from the vendor for support, upgrades, and defect fixes
o Risk of suspension of an open-source / free tool project
o Unforeseen risks, such as the inability to support a new platform

6.2.2 Special Considerations for Some Types of Tools (K1)
Test Execution Tools
Test execution tools execute test objects using automated test scripts. This type of tool often requires significant effort in order to achieve significant benefits.

Capturing tests by recording the actions of a manual tester seems attractive, but this approach does not scale to large numbers of automated test scripts. A captured script is a linear representation with specific data and actions as part of each script. This type of script may be unstable when unexpected events occur.




A data-driven testing approach separates out the test inputs (the data), usually into a spreadsheet, and uses a more generic test script that can read the input data and execute the same test script with different data. Testers who are not familiar with the scripting language can then create the test data for these predefined scripts.
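
A minimal sketch of the data-driven idea: the test logic lives in one generic script, while inputs and expected values come from a table (here a small CSV read with Python's standard library; the function under test and the data are invented for illustration):

    import csv, io

    def add(a, b):                       # trivial function under test
        return a + b

    # In practice the table would be a spreadsheet or CSV file maintained by testers.
    test_table = io.StringIO("a,b,expected\n1,2,3\n10,-4,6\n")

    for row in csv.DictReader(test_table):
        actual = add(int(row["a"]), int(row["b"]))
        verdict = "PASS" if actual == int(row["expected"]) else "FAIL"
        print(f"add({row['a']}, {row['b']}) -> {actual}: {verdict}")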

There are other techniques employed within data-driven approaches where, instead of hard-coded data combinations placed in a spreadsheet, data is generated at run time using algorithms based on configurable parameters and supplied to the application. For example, a tool may use an algorithm that generates a random user ID, and a seed is employed to control the randomness so that the generated pattern is repeatable.
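
A sketch of that idea (names and parameters are invented): a generator produces pseudo-random user IDs from configurable parameters, and fixing the seed makes the generated test data reproducible across runs:

    import random
    import string

    def generate_user_ids(count, length=8, seed=None):
        """Generate pseudo-random user IDs; the same seed yields the same sequence."""
        rng = random.Random(seed)
        alphabet = string.ascii_uppercase + string.digits
        return ["".join(rng.choice(alphabet) for _ in range(length)) for _ in range(count)]

    print(generate_user_ids(3, seed=42))  # repeatable
    print(generate_user_ids(3, seed=42))  # identical to the previous call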

In a keyword-driven testing approach, the spreadsheet contains keywords describing the actions to be taken (also called action words), and test data. Testers (even if they are not familiar with the scripting language) can then define tests using the keywords, which can be tailored to the application being tested.
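
A tiny keyword-driven sketch: table rows pair an action word with test data, and a dispatcher maps each keyword to an implementation supplied by automation specialists (the keywords and actions below are invented for illustration):

    def open_app(name):           print(f"opening {name}")
    def enter_text(field, value): print(f"typing '{value}' into {field}")
    def press(button):            print(f"pressing {button}")

    keywords = {"OpenApp": open_app, "EnterText": enter_text, "Press": press}

    # Testers author rows like these in a spreadsheet, without touching the scripting language.
    test_steps = [
        ("OpenApp", ["calculator"]),
        ("EnterText", ["display", "2+2"]),
        ("Press", ["equals"]),
    ]

    for keyword, args in test_steps:
        keywords[keyword](*args)   # dispatch each action word to its implementation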

Technical expertise in the scripting language is needed for all approaches (either by testers or by specialists in test automation).

Regardless of the scripting technique used, the expected results for each test need to be stored for later comparison.

Static Analysis Tools
Static analysis tools applied to source code can enforce coding standards, but if applied to existing code they may generate a large quantity of messages. Warning messages do not stop the code from being translated into an executable program, but ideally should be addressed so that maintenance of the code is easier in the future. A gradual implementation of the analysis tool, with initial filters to exclude some messages, is an effective approach.
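
The gradual-introduction idea can be sketched as filtering the tool's output so that only a chosen subset of rules blocks the build at first; the rule identifiers and messages below are invented, not those of any real analyzer:

    # Hypothetical static analysis messages: (rule id, severity, text).
    messages = [
        ("SEC001", "error",   "possible SQL injection"),
        ("STY042", "warning", "line longer than 120 characters"),
        ("CPX007", "warning", "cyclomatic complexity 15 exceeds 10"),
    ]

    # Initial filter: enforce only security rules; style/complexity findings are
    # still reported but tolerated until the backlog has been cleaned up.
    enforced_prefixes = ("SEC",)

    blocking = [m for m in messages if m[0].startswith(enforced_prefixes)]
    print(f"{len(blocking)} blocking finding(s); {len(messages) - len(blocking)} deferred")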

Test Management Tools
Test management tools need to interface with other tools or spreadsheets in order to produce useful information in a format that fits the needs of the organization.








6.3 Introducing a Tool into an Organization (K1)    15 minutes
Terms
No specific terms.

Background
The main considerations in selecting a tool for an organization include:
o Assessment of organizational maturity, strengths and weaknesses, and identification of opportunities for an improved test process supported by tools
o Evaluation against clear requirements and objective criteria
o A proof-of-concept, by using a test tool during the evaluation phase to establish whether it performs effectively with the software under test and within the current infrastructure, or to identify changes needed to that infrastructure to use the tool effectively
o Evaluation of the vendor (including training, support and commercial aspects) or of service support suppliers in the case of non-commercial tools
o Identification of internal requirements for coaching and mentoring in the use of the tool
o Evaluation of training needs, considering the current test team's test automation skills
o Estimation of a cost-benefit ratio based on a concrete business case

Introducing the selected tool into an organization starts with a pilot project, which has the following objectives:
o Learn more detail about the tool
o Evaluate how the tool fits with existing processes and practices, and determine what would need to change
o Decide on standard ways of using, managing, storing and maintaining the tool and the test assets (e.g., deciding on naming conventions for files and tests, creating libraries and defining the modularity of test suites)
o Assess whether the benefits will be achieved at reasonable cost

Success factors for the deployment of the tool within an organization include:
o Rolling out the tool to the rest of the organization incrementally
o Adapting and improving processes to fit with the use of the tool
o Providing training and coaching/mentoring for new users
o Defining usage guidelines
o Implementing a way to gather usage information from the actual use
o Monitoring tool use and benefits
o Providing support for the test team for a given tool
o Gathering lessons learned from all teams

References
6.2.2 Buwalda, 2001; Fewster, 1999
6.3 Fewster, 1999








7. References
Standards
ISTQB Glossary of Terms used in Software Testing, Version 2.1

[CMMI] Chrissis, M.B., Konrad, M. and Shrum, S. (2004) CMMI, Guidelines for Process Integration and Product Improvement, Addison Wesley: Reading, MA
See Section 2.1
[IEEE Std 829-1998] IEEE Std 829™ (1998) IEEE Standard for Software Test Documentation
See Sections 2.3, 2.4, 4.1, 5.2, 5.3, 5.5, 5.6
[IEEE 1028] IEEE Std 1028™ (2008) IEEE Standard for Software Reviews and Audits
See Section 3.2
[IEEE 12207] IEEE 12207/ISO/IEC 12207-2008, Software life cycle processes
See Section 2.1
[ISO 9126] ISO/IEC 9126-1:2001, Software Engineering – Software Product Quality
See Section 2.3


Books
[Beizer, 1990] Beizer, B. (1990) Software Testing Techniques (2nd edition), Van Nostrand Reinhold: Boston
See Sections 1.2, 1.3, 2.3, 4.2, 4.3, 4.4, 4.6
[Black, 2001] Black, R. (2001) Managing the Testing Process (3rd edition), John Wiley & Sons: New York
See Sections 1.1, 1.2, 1.4, 1.5, 2.3, 2.4, 5.1, 5.2, 5.3, 5.5, 5.6
[Buwalda, 2001] Buwalda, H. et al. (2001) Integrated Test Design and Automation, Addison Wesley: Reading, MA
See Section 6.2
[Copeland, 2004] Copeland, L. (2004) A Practitioner's Guide to Software Test Design, Artech House: Norwood, MA
See Sections 2.2, 2.3, 4.2, 4.3, 4.4, 4.6
[Craig, 2002] Craig, Rick D. and Jaskiel, Stefan P. (2002) Systematic Software Testing, Artech House: Norwood, MA
See Sections 1.4.5, 2.1.3, 2.4, 4.1, 5.2.5, 5.3, 5.4
[Fewster, 1999] Fewster, M. and Graham, D. (1999) Software Test Automation, Addison Wesley: Reading, MA
See Sections 6.2, 6.3
[Gilb, 1993] Gilb, Tom and Graham, Dorothy (1993) Software Inspection, Addison Wesley: Reading, MA
See Sections 3.2.2, 3.2.4
[Hetzel, 1988] Hetzel, W. (1988) Complete Guide to Software Testing, QED: Wellesley, MA
See Sections 1.3, 1.4, 1.5, 2.1, 2.2, 2.3, 2.4, 4.1, 5.1, 5.3
[Kaner, 2002] Kaner, C., Bach, J. and Pettichord, B. (2002) Lessons Learned in Software Testing, John Wiley & Sons: New York
See Sections 1.1, 4.5, 5.2



[Myers, 1979] Myers, Glenford J. (1979) The Art of Software Testing, John Wiley & Sons: New York
See Sections 1.2, 1.3, 2.2, 4.3
[van Veenendaal, 2004] van Veenendaal, E. (ed.) (2004) The Testing Practitioner (Chapters 6, 8, 10), UTN Publishers: The Netherlands
See Sections 3.2, 3.3








8. Appendix A – Syllabus Background
History of this Document
This document was prepared between 2004 and 2011 by a Working Group comprised of members appointed by the International Software Testing Qualifications Board (ISTQB). It was initially reviewed by a selected review panel, and then by representatives drawn from the international software testing community. The rules used in the production of this document are shown in Appendix C.
This document is the syllabus for the International Foundation Certificate in Software Testing, the first-level international qualification approved by the ISTQB (www.istqb.org).

Objectives of the Foundation Certificate Qualification
o To gain recognition for testing as an essential and professional software engineering specialization
o To provide a standard framework for the development of testers' careers
o To enable professionally qualified testers to be recognized by employers, customers and peers, and to raise the profile of testers
o To promote consistent and good testing practices within all software engineering disciplines
o To identify testing topics that are relevant and of value to industry
o To enable software suppliers to hire certified testers and thereby gain commercial advantage over their competitors by advertising their tester recruitment policy
o To provide an opportunity for testers and those with an interest in testing to acquire an internationally recognized qualification in the subject


Objectives of the International Qualification (adapted from the ISTQB meeting at Sollentuna, November 2001)
o To be able to compare testing skills across different countries
o To enable testers to move across country borders more easily
o To enable multinational/international projects to have a common understanding of testing issues
o To increase the number of qualified testers worldwide
o To have more impact/value as an internationally-based initiative than from any country-specific approach
o To develop a common international body of understanding and knowledge about testing through the syllabus and terminology, and to increase the level of knowledge about testing for all participants
o To promote testing as a profession in more countries
o To enable testers to gain a recognized qualification in their native language
o To enable sharing of knowledge and resources across countries
o To provide international recognition of testers and this qualification due to participation from many countries


Entry Requirements for this Qualification
The entry criterion for taking the ISTQB Foundation Certificate in Software Testing examination is that candidates have an interest in software testing. However, it is strongly recommended that candidates also:
o Have at least a minimal background in either software development or software testing, such as six months of experience as a system or user acceptance tester or as a software developer
o Take a course that has been accredited to ISTQB standards (by one of the ISTQB-recognized National Boards).


Background and History of the Foundation Certificate in Software Testing
The independent certification of software testers began in the UK with the British Computer Society's Information Systems Examination Board (ISEB), when a Software Testing Board was set up in 1998 (www.bcs.org.uk/iseb). In 2002, ASQF in Germany began to support a German tester qualification scheme (www.asqf.de). This syllabus is based on the ISEB and ASQF syllabi; it includes reorganized, updated and additional content, and the emphasis is directed at topics that will provide the most practical help to testers.
An existing Foundation Certificate in Software Testing (e.g., from ISEB, ASQF or an ISTQB-recognized National Board) awarded before this International Certificate was released will be deemed to be equivalent to the International Certificate. The Foundation Certificate does not expire and does not need to be renewed. The date it was awarded is shown on the Certificate.
Within each participating country, local aspects are controlled by a national ISTQB-recognized Software Testing Board. The duties of National Boards are specified by the ISTQB, but are implemented within each country. The duties of the country boards are expected to include accreditation of training providers and the setting of exams.








9. Appendix B – Learning Objectives/Cognitive Level of Knowledge
The following learning objectives are defined as applying to this syllabus. Each topic in the syllabus will be examined according to the learning objective for it.

Level 1: Remember (K1)
The candidate will recognize, remember and recall a term or concept.
Keywords: Remember, retrieve, recall, recognize, know

Example
Can recognize the definition of “failure” as:
o “Non-delivery of service to an end user or any other stakeholder” or
o “Actual deviation of the component or system from its expected delivery, service or result”

Level 2: Understand (K2)
The candidate can select the reasons or explanations for statements related to the topic, and can summarize, compare, classify, categorize and give examples for the testing concept.
Keywords: Summarize, generalize, abstract, classify, compare, map, contrast, exemplify, interpret, translate, represent, infer, conclude, categorize, construct models

Examples
Can explain the reason why tests should be designed as early as possible:
o To find defects when they are cheaper to remove
o To find the most important defects first

Can explain the similarities and differences between integration and system testing:
o Similarities: testing more than one component, and can test non-functional aspects
o Differences: integration testing concentrates on interfaces and interactions, and system testing concentrates on whole-system aspects, such as end-to-end processing

Level 3: Apply (K3)
The candidate can select the correct application of a concept or technique and apply it to a given context.
Keywords: Implement, execute, use, follow a procedure, apply a procedure
Examples
o Can identify boundary values for valid and invalid partitions
o Can select test cases from a given state transition diagram in order to cover all transitions

Level 4: Analyze (K4)
The candidate can separate information related to a procedure or technique into its constituent parts for better understanding, and can distinguish between facts and inferences. Typical application is to analyze a document, software or project situation and propose appropriate actions to solve a problem or task.
Keywords: Analyze, organize, find coherence, integrate, outline, parse, structure, attribute, deconstruct, differentiate, discriminate, distinguish, focus, select





Example
o Analyze product risks and propose preventive and corrective mitigation activities
o Describe which portions of an incident report are factual and which are inferred from results

Reference
(For the cognitive levels of learning objectives)
Anderson, L. W. and Krathwohl, D. R. (eds) (2001) A Taxonomy for Learning, Teaching, and Assessing: A Revision of Bloom's Taxonomy of Educational Objectives, Allyn & Bacon








10. Appendix C – Rules Applied to the ISTQB Foundation Syllabus
The rules listed here were used in the development and review of this syllabus. (A “TAG” is shown after each rule as a shorthand abbreviation of the rule.)

10.1.1 General Rules
SG1. The syllabus should be understandable and absorbable by people with zero to six months (or more) of experience in testing. (6-MONTH)
SG2. The syllabus should be practical rather than theoretical. (PRACTICAL)
SG3. The syllabus should be clear and unambiguous to its intended readers. (CLEAR)
SG4. The syllabus should be understandable to people from different countries, and easily translatable into different languages. (TRANSLATABLE)
SG5. The syllabus should use American English. (AMERICAN-ENGLISH)

10.1.2 Current Content
SC1. The syllabus should include recent testing concepts and should reflect current best practices in software testing where this is generally agreed. The syllabus is subject to review every three to five years. (RECENT)
SC2. The syllabus should minimize time-related issues, such as current market conditions, to enable it to have a shelf life of three to five years. (SHELF-LIFE)

10.1.3 Learning Objectives
LO1. Learning objectives should distinguish between items to be recognized/remembered (cognitive level K1), items the candidate should understand conceptually (K2), items the candidate should be able to practice/use (K3), and items the candidate should be able to use to analyze a document, software or project situation in context (K4). (KNOWLEDGE-LEVEL)
LO2. The description of the content should be consistent with the learning objectives. (LO-CONSISTENT)
LO3. To illustrate the learning objectives, sample exam questions for each major section should be issued along with the syllabus. (LO-EXAM)

               Structure
10.1.4 Overall S
         e           of
ST1. The structure o the syllabus should be clear and allow cross-ref                        her
                                                                     ferencing to and from oth
         om
parts, fro exam que               from other re
                      estions and f                                 OSS-REF)
                                              elevant documents. (CRO
ST2. Overlap betwee sections o the syllabu should be minimized. (OVERLAP)
                      en          of          us
ST3. Eac section of the syllabus should hav the same structure. (STRUCTURE
          ch          f           s           ve          s                      E-CONSISTE  ENT)
         e
ST4. The syllabus sh              n           ate        a
                     hould contain version, da of issue and page num              ry
                                                                     mber on ever page.
(VERSIO  ON)
         e
ST5. The syllabus sh             e                        unt        o           n           on
                     hould include a guideline for the amou of time to be spent in each sectio (to
         he          mportance of each topic). (TIME-SPEN
reflect th relative im                                    NT)

     ences
Refere
SR1. So ources and re             ll          or           n            us
                     eferences wil be given fo concepts in the syllabu to help training provide  ers
         more informa
find out m           ation about the topic. (REEFS)
SR2. Wh              re          y             nd
         here there ar not readily identified an clear sources, more d  detail should be provided in the
                     le,          s
syllabus. For exampl definitions are in the G  Glossary, so only the term are listed in the syllab
                                                                        ms                       bus.
(NON-REF DETAIL)





Sources of Information

Terms used in the syllabus are defined in the ISTQB Glossary of Terms used in Software Testing. A version of the Glossary is available from ISTQB.

A list of recommended books on software testing is also issued in parallel with this syllabus. The main book list is part of the References section.








11. Appendix D – Notice to Training Providers

Each major subject heading in the syllabus is assigned an allocated time in minutes. The purpose of this is both to give guidance on the relative proportion of time to be allocated to each section of an accredited course, and to give an approximate minimum time for the teaching of each section. Training providers may spend more time than is indicated, and candidates may spend more time again in reading and research. A course curriculum does not have to follow the same order as the syllabus.

The syllabus contains references to established standards, which must be used in the preparation of training material. Each standard used must be the version quoted in the current version of this syllabus. Other publications, templates or standards not referenced in this syllabus may also be used and referenced, but will not be examined.

All K3 and K4 Learning Objectives require a practical exercise to be included in the training materials.







12. Appendix E – Release Notes

Release 2010
1. Changes to Learning Objectives (LOs) include some clarification
   a. Wording changed for the following LOs (content and level of the LO remain unchanged): LO-1.2.2, LO-1.3.1, LO-1.4.1, LO-1.5.1, LO-2.1.1, LO-2.1.3, LO-2.4.2, LO-4.1.3, LO-4.2.1, LO-4.2.2, LO-4.3.1, LO-4.3.2, LO-4.3.3, LO-4.4.1, LO-4.4.2, LO-4.4.3, LO-4.6.1, LO-5.1.2, LO-5.2.2, LO-5.3.2, LO-5.3.3, LO-5.5.2, LO-5.6.1, LO-6.1.1, LO-6.2.2, LO-6.3.2.
   b. LO-1.1.5 has been reworded and upgraded to K2, because a comparison of defect-related terms can be expected.
   c. LO-1.2.3 (K2) has been added. The content was already covered in the 2007 syllabus.
   d. LO-3.1.3 (K2) now combines the content of LO-3.1.3 and LO-3.1.4.
   e. LO-3.1.4 has been removed from the 2010 syllabus, as it is partially redundant with LO-3.1.3.
   f. LO-3.2.1 has been reworded for consistency with the 2010 syllabus content.
   g. LO-3.3.2 has been modified, and its level has been changed from K1 to K2, for consistency with LO-3.1.2.
   h. LO-4.4.4 has been modified for clarity, and has been changed from a K3 to a K4. Reason: LO-4.4.4 had already been written in a K4 manner.
   i. LO-6.1.2 (K1) was dropped from the 2010 syllabus and was replaced with LO-6.1.3 (K2). There is no LO-6.1.2 in the 2010 syllabus.
2. Consistent use of the term test approach according to the definition in the glossary. The term test strategy will not be required as a term to recall.
3. Chapter 1.4 now contains the concept of traceability between test basis and test cases.
4. Chapter 2.x now contains test objects and test basis.
5. Re-testing is now the main term in the glossary instead of confirmation testing.
6. The aspect of data quality and testing has been added at several locations in the syllabus: data quality and risk in Chapters 2.2, 5.5 and 6.1.8.
7. Chapter 5.2.3 Entry Criteria has been added as a new subchapter. Reason: consistency with Exit Criteria (-> entry criteria added to LO-5.2.9).
8. Consistent use of the terms test strategy and test approach with their definition in the glossary.
9. Chapter 6.1 shortened, because the tool descriptions were too large for a 45-minute lesson.
10. IEEE Std 829:2008 has been released. This version of the syllabus does not yet consider this new edition. Section 5.2 refers to the document Master Test Plan. The content of the Master Test Plan is covered by the concept that the document “Test Plan” covers different levels of planning: test plans for the test levels can be created, as well as a test plan at the project level covering multiple test levels. The latter is named Master Test Plan in this syllabus and in the ISTQB Glossary.
11. Code of Ethics has been moved from the CTAL to the CTFL.


Release 2011
Changes made with the “maintenance release” 2011
1. General: Working Party replaced by Working Group.
2. Replaced post-conditions by postconditions in order to be consistent with the ISTQB Glossary 2.1.
3. First occurrence: ISTQB replaced by ISTQB®.
4. Introduction to this Syllabus: Descriptions of Cognitive Levels of Knowledge removed, because this was redundant to Appendix B.


5. Section 1.6: Because the intent was not to define a Learning Objective for the “Code of Ethics”, the cognitive level for the section has been removed.
6. Sections 2.2.1, 2.2.2, 2.2.3, 2.2.4 and 3.2.3: Fixed formatting issues in lists.
7. Section 2.2.2: The word failure was not correct in “…isolate failures to a specific component…”; it has therefore been replaced with “defect” in that sentence.
8. Section 2.3: Corrected formatting of the bullet list of test objectives related to test terms in the section Test Types (K2).
9. Section 2.3.4: Updated the description of debugging to be consistent with Version 2.1 of the ISTQB Glossary.
10. Section 2.4: Removed the word “extensive” from “includes extensive regression testing”, because the extent depends on the change (size, risks, value, etc.), as written in the next sentence.
11. Section 3.2: The word “including” has been removed to clarify the sentence.
12. Section 3.2.1: Because the activities of a formal review had been incorrectly formatted, the review process had 12 main activities instead of six, as intended. It has been changed back to six, which makes this section compliant with the Syllabus 2007 and the ISTQB Advanced Level Syllabus 2007.
13. Section 4: The word “developed” was replaced by “defined”, because test cases get defined and not developed.
14. Section 4.2: Text changed to clarify how black-box and white-box testing could be used in conjunction with experience-based techniques.
15. Section 4.3.5: Text changed from “…between actors, including users and the system…” to “…between actors (users or systems)…”.
16. Section 4.3.5: Alternative path replaced by alternative scenario.
17. Section 4.4.2: In order to clarify the term branch testing in the text of Section 4.4, the sentence describing the focus of branch testing has been changed.
18. Section 4.5, Section 5.2.6: The term “experienced-based” testing has been replaced by the correct term “experience-based”.
19. Section 6.1: Heading “6.1.1 Understanding the Meaning and Purpose of Tool Support for Testing (K2)” replaced by “6.1.1 Tool Support for Testing (K2)”.
20. Section 7 / Books: The 3rd edition of [Black,2001] is listed, replacing the 2nd edition.
21. Appendix D: Chapters requiring exercises have been replaced by the generic requirement that all Learning Objectives K3 and higher require exercises. This is a requirement specified in the ISTQB Accreditation Process (Version 1.26).
22. Appendix E: The changed learning objectives between Version 2007 and 2010 are now correctly listed.








13. Index
action word .......................... 63
alpha testing ........................ 24, 27
architecture ......................... 15, 21, 22, 25, 28, 29
archiving ............................ 17, 30
automation ........................... 29
benefits of independence ............. 47
benefits of using tool ............... 62
beta testing ......................... 24, 27
black-box technique .................. 37, 39, 40
black-box test design technique ...... 39
black-box testing .................... 28
bottom-up ............................ 25
boundary value analysis .............. 40
bug .................................. 11
captured script ...................... 62
checklists ........................... 34, 35
choosing test technique .............. 44
code coverage ........................ 28, 29, 37, 42, 58
commercial off the shelf (COTS) ...... 22
compiler ............................. 36
complexity ........................... 11, 36, 50, 59
component integration testing ........ 22, 25, 29, 59, 60
component testing .................... 22, 24, 25, 27, 29, 37, 41, 42
configuration management ............. 45, 48, 52
Configuration management tool ........ 58
confirmation testing ................. 13, 15, 16, 21, 28, 29
contract acceptance testing .......... 27
control flow ......................... 28, 36, 37, 42
coverage ............................. 15, 24, 28, 29, 37, 38, 39, 40, 42, 50, 51, 58, 60, 62
coverage tool ........................ 58
custom-developed software ............ 27
data flow ............................ 36
data-driven approach ................. 63
data-driven testing .................. 62
debugging ............................ 13, 24, 29, 58
debugging tool ....................... 24, 58
decision coverage .................... 37, 42
decision table testing ............... 40, 41
decision testing ..................... 42
defect ............................... 10, 11, 13, 14, 16, 18, 21, 24, 26, 28, 29, 31, 32, 33, 34, 35, 36, 37, 39, 40, 41, 43, 44, 45, 47, 49, 50, 51, 53, 54, 55, 59, 60, 69
defect density ....................... 50, 51
defect tracking tool ................. 59
development .......................... 8, 11, 12, 13, 14, 18, 21, 22, 24, 29, 32, 33, 36, 38, 44, 47, 49, 50, 52, 53, 55, 59, 67
development model .................... 21, 22
drawbacks of independence ............ 47
driver ............................... 24
dynamic analysis tool ................ 58, 60
dynamic testing ...................... 13, 31, 32, 36
emergency change ..................... 30
enhancement .......................... 27, 30
entry criteria ....................... 33
equivalence partitioning ............. 40
error ................................ 10, 11, 18, 43, 50
error guessing ....................... 18, 43, 50
exhaustive testing ................... 14
exit criteria ........................ 13, 15, 16, 33, 35, 45, 48, 49, 50, 51
expected result ...................... 16, 38, 48, 63
experience-based technique ........... 37, 39, 43
experience-based test design technique  39
exploratory testing .................. 43, 50
factory acceptance testing ........... 27
failure .............................. 10, 11, 13, 14, 18, 21, 24, 26, 32, 36, 43, 46, 50, 51, 53, 54, 69
failure rate ......................... 50, 51
fault ................................ 10, 11, 43
fault attack ......................... 43
field testing ........................ 24, 27
follow-up ............................ 33, 34, 35
formal review ........................ 31, 33
functional requirement ............... 24, 26
functional specification ............. 28
functional task ...................... 25
functional test ...................... 28
functional testing ................... 28
functionality ........................ 24, 25, 28, 50, 53, 62
impact analysis ...................... 21, 30, 38
incident ............................. 15, 16, 17, 19, 24, 46, 48, 55, 58, 59, 62
incident logging ..................... 55
incident management .................. 48, 55, 58
incident management tool ............. 58, 59
incident report ...................... 46, 55
independence ......................... 18, 47, 48
informal review ...................... 31, 33, 34
inspection ........................... 31, 33, 34, 35
inspection leader .................... 33
integration .......................... 13, 22, 24, 25, 27, 29, 36, 40, 41, 42, 45, 48, 59, 60, 69
integration testing .................. 22, 24, 25, 29, 36, 40, 45, 59, 60, 69
interoperability testing ............. 28
introducing a tool into an organization  57, 64
ISO 9126 ............................. 11, 29, 30, 65
iterative-incremental development model  22
keyword-driven approach .............. 63
keyword-driven testing ............... 62
kick-off ............................. 33
learning objective ................... 8, 9, 10, 21, 31, 37, 45, 57, 69, 70, 71
load testing ......................... 28, 58, 60


load testing tool .................... 58
maintainability testing .............. 28
maintenance testing .................. 21, 30
management tool ...................... 48, 58, 59, 63
maturity ............................. 17, 33, 38, 64
metric ............................... 33, 35, 45
mistake .............................. 10, 11, 16
modelling tool ....................... 59
moderator ............................ 33, 34, 35
monitoring tool ...................... 48, 58
non-functional requirement ........... 21, 24, 26
non-functional testing ............... 11, 28
objectives for testing ............... 13
off-the-shelf ........................ 22
operational acceptance testing ....... 27
operational test ..................... 13, 23, 30
patch ................................ 30
peer review .......................... 33, 34, 35
performance testing .................. 28, 58
performance testing tool ............. 58, 60
pesticide paradox .................... 14
portability testing .................. 28
probe effect ......................... 58
procedure ............................ 16
product risk ......................... 18, 45, 53, 54
project risk ......................... 12, 45, 53
prototyping .......................... 22
quality .............................. 8, 10, 11, 13, 19, 28, 37, 38, 47, 48, 50, 53, 55, 59
rapid application development (RAD) .. 22
Rational Unified Process (RUP) ....... 22
recorder ............................. 34
regression testing ................... 15, 16, 21, 28, 29, 30
Regulation acceptance testing ........ 27
reliability .......................... 11, 13, 28, 50, 53, 58
reliability testing .................. 28
requirement .......................... 13, 22, 24, 32, 34
requirements management tool ......... 58
requirements specification ........... 26, 28
responsibilities ..................... 24, 31, 33
re-testing ........................... 29, See confirmation testing
review ............................... 13, 19, 31, 32, 33, 34, 35, 36, 47, 48, 53, 55, 58, 67, 71
review tool .......................... 58
reviewer ............................. 33, 34
risk ................................. 11, 12, 13, 14, 25, 26, 29, 30, 38, 44, 45, 49, 50, 51, 53, 54
risk-based approach .................. 54
risk-based testing ................... 50, 53, 54
risks ................................ 11, 25, 49, 53
risks of using tool .................. 62
robustness testing ................... 24
roles ................................ 8, 31, 33, 34, 35, 47, 48, 49
root cause ........................... 10, 11
scribe ............................... 33, 34
scripting language ................... 60, 62, 63
security ............................. 27, 28, 36, 47, 50, 58
security testing ..................... 28
security tool ........................ 58, 60
simulators ........................... 24
site acceptance testing .............. 27
software development ................. 8, 11, 21, 22
software development model ........... 22
special considerations for some types of tool  62
specification-based technique ........ 29, 39, 40
specification-based testing .......... 37
stakeholders ......................... 12, 13, 16, 18, 26, 39, 45, 54
state transition testing ............. 40, 41
statement coverage ................... 42
statement testing .................... 42
static analysis ...................... 32, 36
static analysis tool ................. 31, 36, 58, 59, 63
static technique ..................... 31, 32
static testing ....................... 13, 32
stress testing ....................... 28, 58, 60
stress testing tool .................. 58, 60
structural testing ................... 24, 28, 29, 42
structure-based technique ............ 39, 42
structure-based test design technique  42
structure-based testing .............. 37, 42
stub ................................. 24
success factors ...................... 35
system integration testing ........... 22, 25
system testing ....................... 13, 22, 24, 25, 26, 27, 49, 69
technical review ..................... 31, 33, 34, 35
test analysis ........................ 15, 38, 48, 49
test approach ........................ 38, 48, 50, 51
test basis ........................... 15
test case ............................ 13, 14, 15, 16, 24, 28, 32, 37, 38, 39, 40, 41, 42, 45, 51, 55, 59, 69
test case specification .............. 37, 38, 55
test cases ........................... 28
test closure ......................... 10, 15, 16
test condition ....................... 38
test conditions ...................... 13, 15, 16, 28, 38, 39
test control ......................... 15, 45, 51
test coverage ........................ 15, 50
test data ............................ 15, 16, 38, 48, 58, 60, 62, 63
test data preparation tool ........... 58, 60
test design .......................... 13, 15, 22, 37, 38, 39, 43, 48, 58, 62
test design specification ............ 45
test design technique ................ 37, 38, 39
test design tool ..................... 58, 59
Test Development Process ............. 38
test effort .......................... 50
test environment ..................... 15, 16, 17, 24, 26, 48, 51
test estimation ...................... 50
test execution ....................... 13, 15, 16, 32, 36, 38, 43, 45, 57, 58, 60
test execution schedule .............. 38
test execution tool .................. 16, 38, 57, 58, 60, 62
test harness ......................... 16, 24, 52, 58, 60
test implementation .................. 16, 38, 49
test leader ........................... 18, 45, 47, 55
test leader tasks .................................... 47
test level ..... 21, 22, 24, 28, 29, 30, 37, 40, 42, 44, 45, 48, 49
test log ................................. 15, 16, 43, 60
test management ............................. 45, 58
test management tool ....................... 58, 63
test manager ............................... 8, 47, 53
test monitoring ................................ 48, 51
test objective ....... 13, 22, 28, 43, 44, 48, 51
test oracle ............................................. 60
test organization .................................... 47
test plan ...... 15, 16, 32, 45, 48, 49, 52, 53, 54
test planning .............. 15, 16, 45, 49, 52, 54
test planning activities ........................... 49
test procedure ............ 15, 16, 37, 38, 45, 49
test procedure specification ............. 37, 38
test progress monitoring ........................ 51
test report ........................................ 45, 51
test reporting ................................... 45, 51
test script ................................... 16, 32, 38
test strategy .......................................... 47
test suite ................................................ 29
test summary report ......... 15, 16, 45, 48, 51
test tool classification ............................ 58
test type ......................... 21, 28, 30, 48, 75
test-driven development ........................ 24
tester ...... 10, 13, 18, 34, 41, 43, 45, 47, 48, 52, 62, 67
tester tasks ............................................ 48
test-first approach .................................. 24
testing and quality .................................. 11
testing principles .............................. 10, 14
testware ........................... 15, 16, 17, 48, 52
tool support ................... 24, 32, 42, 57, 62
tool support for management of testing and tests ..... 59
tool support for performance and monitoring ........... 60
tool support for static testing ................. 59
tool support for test execution and logging ............ 60
tool support for test specification .......... 59
tool support for testing ..................... 57, 62
top-down ................................................ 25
traceability ............................... 38, 48, 52
transaction processing sequences ......... 25
types of test tool .............................. 57, 58
unit test framework .................... 24, 58, 60
unit test framework tool .................. 58, 60
upgrades ................................................ 30
usability ................... 11, 27, 28, 45, 47, 53
usability testing ............................... 28, 45
use case test ................................... 37, 40
use case testing ......................... 37, 40, 41
use cases ....................... 22, 26, 28, 41
user acceptance testing ....................... 27
validation ............................................... 22
verification ............................................. 22
version control ....................................... 52
V-model .................................................. 22
walkthrough ........................... 31, 33, 34
white-box test design technique ....... 39, 42
white-box testing ............................. 28, 42