                                M&E Toolkit:
             A Compilation of Monitoring and Evaluation Resources


      Prepared by Melissa Fuster, ARC Presidential Intern, for the American Red Cross
                  International Programs Technical Solutions Unit


                                 August 2006


                                        Introduction

The purpose of this toolkit is to provide an easy way of accessing program monitoring and
evaluation (M&E) resources. It also contains useful resources for program design, but it is mostly
focused on M&E.

The toolkit is composed of two sections:

   Section A
      This section includes a total of 26 M&E manuals from different humanitarian and
      development assistance agencies.

   Section B
      This section is composed mainly of non-manual resources from different agencies. It
      contains general M&E resources, methods, and indicators.

The toolkit has two main components. The first is this document, which contains reviews of the
different resources included in the two sections, along with matrices organizing the resources for
easy access. The second is the electronic resource compilation CD, which contains electronic
copies of most of the resources included in this toolkit. The CD also contains a copy of the
Sphere Project handbook (2004 version).






                                                         INDEX

Introduction
Method Review Matrix

Section A: M&E Manuals
   Introduction
   M&E Manuals Review Matrix
   Review Section
      (Africare 2003)  Africare Field Manual on the design, implementation, monitoring and evaluation of food security activities.
      (ALNAP 2003a)  Participation by crisis-affected populations in humanitarian action: practitioner’s handbook
      (ALNAP 2003b)  Training Modules.
      (ALNAP 2006)  Evaluating humanitarian action using OECD-DAC Criteria
      (CARE 1997)  CARE-Uganda, January 1997. Guidelines to Monitoring and Evaluation
      (CARE 2002)  CARE Project Design Handbook
      (CIVICUS)  World Alliance for Citizen Participation (CIVICUS) Monitoring and Evaluation
      (CRS 1999)  Rapid Rural Appraisal & Participatory Rural Appraisal (RRA/PRA)
      (CRS 2004)  ProPack: The Catholic Relief Services (CRS) Project Package.
      (FAO 2003)  Food and Agriculture Organization of the United Nations Auto-Evaluation Guidelines
      (FOCUS 2000)  A guide to monitoring and evaluating adolescent reproductive health programs. FOCUS on Young Adults.
      (IFAD)  A guide for project M&E: Managing for Impact in Rural Development.
      (IFRC 2002)  International Federation of the Red Cross Handbook for Monitoring and Evaluation, 1st edition.
      (MEASURE 2005)  A Guide for Monitoring and Evaluating Child Health Programs
      (Mercy Corps 2003)  Design, Monitoring and Evaluation Guidebook
      (SAVE)  How to Mobilize Communities for Health and Social Change (Ch. 6: Evaluate Together). Save the Children
      (SAVE 2006)  Toolkits: A Practical Guide to planning, monitoring, evaluation and impact assessment
      (SAVE 2004)  Children in crisis: Good practices in evaluating psychosocial programming
      (SFCG 2006)  Designing for Results: Integrating Monitoring and Evaluation in Conflict Transformation Programs
      (UNDP 2002)  Handbook on Monitoring and Evaluating for Results.
      (UNFPA 2004)  The Programme Manager’s Planning, Monitoring and Evaluation Toolkit.
      (UNICEF 1991)  A UNICEF Guide for Monitoring and Evaluation.
      (WB-IPDET 2001)  International Program for Development Evaluation Training. The Independent Evaluation Group.
      (WB 2004a)  Monitoring & Evaluation: Some tools, methods & approaches
      (WB 2004b)  Ten steps to a results-based monitoring and evaluation system: A handbook for development practitioners.
      (WKKF 1998)  W.K. Kellogg Foundation Evaluation Handbook.

Section B: Non-Manual Resources
   Introduction
   Non-Manual Resources Review Matrix
   Review Section
      (Adrien 2003)  Guide to conducting reviews of organizations supplying M&E training.
      (AEA 2002)  Impact Evaluations When Time and Money are Limited: Lessons from International Development on the Design of rapid and economical, but methodologically sound, impact evaluations
      (Aubel 1999)  Participatory Program Evaluation Manual
      (Bakyaita and Root 2005)  Building Capacity in Monitoring and Evaluating Roll Back Malaria in Africa: A conceptual framework for the Roll Back Malaria partnership.
      (Bamberger et al 2003)  Shoestring Evaluation: Designing Impact Evaluations under Budget, Time and Data Constraints.
      (Bergeron et al 2006a)  Monitoring and Evaluation framework for Title II development-oriented projects, USAID Technical Note 10
      (Bergeron et al 2006b)  Evaluating Title II development-oriented multi-year assistance projects, USAID Technical Note 11
      (Billing et al 1999)  Water and Sanitation Indicators Measurement Guide. Food and Nutrition Technical Assistance (FANTA).
      (Bonnard 2002)  Title II Evaluations Scopes of Work, Technical Note 2. Food and Nutrition Technical Assistance (FANTA).
      (Booth and Lucas 2002)  Good Practice in the Development of PRSP Indicators and Monitoring Systems, ODI Working Paper 172
      (Boyce and Neale 2006a)  Conducting In-Depth Interviews: A Guide for Designing and Conducting In-Depth Interviews for Evaluation Input. Pathfinder International
      (Boyce and Neale 2006b)  Using Mystery Clients: A Guide to Using Mystery Clients for Evaluation Input. Pathfinder International
      (Caplan and Jones 2002)  Practitioner Note Series: Partnership Indicators, measuring the effectiveness of multi-sector approaches to service provision
      (Chapman and Wameyo 2001)  Monitoring and evaluating advocacy: a scoping study.
      (Cramb and Purcell 2001)  Developing forage technologies with smallholder farmers: How to monitor and evaluate impacts. Australian Centre for International Agricultural Research, Impact assessment program: Working paper series #41
      (DAC 1999)  Guidance on evaluating humanitarian assistance in complex emergencies.
      (DAC 2000)  Effective practice in conducting a multi-donor evaluation.
      (Edbert and Hoechstetter)  Mission Possible: Evaluating Advocacy Grants. Foundation News.
      (George)  The Quantification of Impact. Enterprise Development Impact Assessment Information Service.
      (GNT 2006)  How to work effectively with an evaluation consultant
      (HIV Alliance)  NGO Support Toolkit
      (IDS 1998)  Participatory monitoring and evaluation: learning from change
      (IFRC 2005)  Operational Framework for Evaluations
      (Levinson et al 1999)  Monitoring and Evaluation: A Guidebook for Nutrition Project Managers in Developing Countries
      (Magnani 1997)  Sampling guide. Food and Nutrition Technical Assistance (FANTA).
      (Mayoux a)  Qualitative Methods. Enterprise Development Impact Assessment Information Service
      (Mayoux b)  Participatory Methods. Enterprise Development Impact Assessment Information Service.
      (Mayoux c)  Whom do we talk to? Issues in Sampling. Enterprise Development Impact Assessment Information Service
      (Mayoux d)  What do we want to know? Selecting indicators. Enterprise Development Impact Assessment Information Service
      (Neale et al 2006)  Preparing a Case Study: A Guide for Designing and Conducting a Case Study for Evaluation Input. Pathfinder International.
      (Pisani 2003)  Estimating the size of populations at risk for HIV: issues and methods. UNAIDS/IMPACT/FHI
      (Ravallion 2002)  Evaluating Anti-Poverty Programs
      (Riely et al 1999)  Food Security Indicators and Framework for Use in the Monitoring and Evaluation of Food Aid Programs. Food and Nutrition Technical Assistance (FANTA).
      (Swindale and Bilinsky 2005)  Household Dietary Diversity Score (HDDS) for Measurement of Household Food Access: Indicator Guide. Washington, D.C.: Food and Nutrition Technical Assistance Project, Academy for Educational Development
      (Swindale and Ohri-Vachaspati 2005)  Measuring Household Food Consumption: A Technical Guide. Washington, D.C.: Food and Nutrition Technical Assistance (FANTA) Project, Academy for Educational Development (AED).
      (USAID Tips 1996a)  Conducting a Participatory Evaluation. Performance Monitoring and Evaluation TIPS.
      (USAID Tips 1996b)  Conducting Key Informant Interviews. Performance Monitoring and Evaluation TIPS.
      (USAID Tips 1996c)  Preparing an Evaluation Scope of Work (SoW). Performance Monitoring and Evaluation TIPS.
      (USAID Tips 1996d)  Using Direct Observation Techniques. Performance Monitoring and Evaluation TIPS.
      (USAID Tips 1996e)  Using Rapid Appraisal Methods. Performance Monitoring and Evaluation TIPS.
      (USAID Tips 1996f)  Selecting Performance Indicators. Performance Monitoring and Evaluation TIPS.
      (USAID Tips 1996g)  Preparing a Performance Monitoring Plan. Performance Monitoring and Evaluation TIPS.
      (USAID Tips 1996h)  Establishing Performance Targets. Performance Monitoring and Evaluation TIPS.
      (USAID Tips 1996i)  Conducting Customer Service Assessments. Performance Monitoring and Evaluation TIPS.
      (USAID Tips 1996j)  Conducting Focus Group Interviews. Performance Monitoring and Evaluation TIPS.
      (Van de Putte 2001)  Follow-up to evaluations of humanitarian programmes
      (Vermillion 2000)  Guide to Monitoring and Evaluation of Irrigation Management Transfer. The Japanese Institute for Irrigation and Drainage (JIID).
      (WB)  Evaluation designs.
      (WB 2002)  Sleeping on our own mats: An introductory guide to community-based Monitoring and Evaluation.
      (WV 2003)  TDI (Transformative Development Indicators) Field Guide. World Vision Development Resources Team
      (Zeller 2004)  Review of Poverty Assessment Tools.






                                                Method Review Matrix
        The following matrix covers the methods included in this toolkit:

           Assessments
           Calendars
           Capacity Enhancement Needs Assessment
           Case Studies
           Cognitive Social Capital Assessment
           Cost-benefit Analysis
           Desk Study
           Diagrams
           Diaries
           Document Review
           Drawing
           Field Visits
           Focus Groups
           Historical profile
           Interviews
           Mapping
           Media content analysis
           Nominal group technique
           Observation
           Participatory Methods (in general)
           Participatory Rural Appraisal (PRA)
           Photography/video
           Proportional piling
           Qualitative methods (in general)
           Quantitative methods (in general)
           Questionnaires
           Ranking
           Rapid Rural Appraisal (RRA)
           Sampling
           Secondary data (in general)
           Self-reported checklist
           Stakeholder analysis
           Surveys
           SWOL/T/C (strengths, weaknesses, opportunities, limitations/threats/constraints) analysis
           Tests
           Timelines
           Transect walk
           Triangulation
           Venn diagram

        The methods are listed in alphabetical order, followed by the resource citation, the location of the resource (Section
        A or B) and a brief comment regarding how much detail the resource presents on the given method.


      Manual(s)              Section   Comment
Assessments
      WKKF 1998                 A   Overview (Psychological health status measure)
      Levinson et al 1999       B   (Direct measurements) Overview
Calendars
      CRS 1999                  A   Introduction and guidance on its use
      ALNAP 2003a               A   Sample exercise
      Cramb and Purcell 2001    B   Information focused on agriculture
Capacity enhancement needs assessment (CENA)
      SFCG 2006                 A   Brief explanation as a tool for peacebuilding programs
Case Studies
      IFAD                      A   Purpose, how-to, tips
      FAO 2003                  A   Short introduction with definition, advantages and disadvantages
      Mayoux a                  B   Review
      SFCG 2006                 A   Brief explanation as a tool for peacebuilding programs
      Neale et al 2006          B   In-depth guide
Cognitive Social Capital Assessment (CSA)
      SFCG 2006                 A   Brief explanation as a tool for peacebuilding programs
Cost-benefit (and effectiveness) analysis
      IFAD                      A   Purpose, how-to, tips
      WB 2004a                  A   Definition, use, advantages, disadvantages, and skill and time requirements
      SAVE 2006                 A   Description, methods for calculating (cost-effectiveness ratio, non-quantifiable benefit rating), strengths, weaknesses, and when to use
Desk Study
      FAO 2003                  A   Detailed information with definition, advantages and disadvantages
Diagrams
      Cramb and Purcell 2001    B   Information focused on agriculture
      SAVE 2006                 A   Description, types (social and resource mapping, timelines, impact diagrams, spider diagrams), strengths, weaknesses, prerequisites
Diaries
      WB-IPDET 2001             A   Overview, advantages and disadvantages
      IFAD                      A   Purpose, how-to, and tips
      SFCG 2006                 A   Explanation, table with strengths, weaknesses and cost (compared with other methods)
Document review
      IFAD                      A   Purpose, how-to, and tips
      IFRC 2002                 A   Basic information
      MEASURE 2005              A   Brief introduction
      WB 2004b                  A   In a table, compared with other methods in terms of cost, training, time requirements and response rate
      SFCG 2006                 A   Explanation, table with strengths, weaknesses and cost (compared with other methods)
      WV 2003                   B   Guidance
Drawing
      SFCG 2006                 A   Explanation, table with strengths, weaknesses and cost (compared with other methods)
Field Visit
      IFRC 2002                 A   How-to; provides the IFRC protocol for visits
      UNDP 2002                 A   Brief discussion
Focus Groups
      ALNAP 2003a               A   In reference table, with objective, advantages, disadvantages, limitations and constraints
      CIVICUS                   A   Description, advantages and disadvantages
      IFAD                      A   Purpose, how-to, tips
      IFRC 2002                 A   Overview
      FAO 2003                  A   Detailed discussion with definition, advantages and disadvantages
      FOCUS 2000                A   Guide for focus groups with school adolescents
      SAVE 2004                 A   Definitions and examples (related to psychosocial programming)
      UNFPA 2004                A   Brief overview
      WB-IPDET 2001             A   Overview and tips for conducting a focus group
      SFCG 2006                 A   Explanation, table with strengths, weaknesses and cost (compared with other methods)
      USAID Tips 1996j          B   Introduction and guidance
      Levinson et al 1999       B   Overview
      WV 2003                   B   Guidance
Historical profile
      Africare 2003             A   Detailed
Interviews
      Africare 2003             A   (Semi-structured) Detailed
      ALNAP 2003a               A   (Semi-structured) In reference table, with objective, advantages, disadvantages, limitations and constraints
      CIVICUS                   A   Description, advantages and disadvantages
      CRS 1999                  A   Introduction and guidance on use
      CARE 1997                 A   (Semi-structured and qualitative) Detailed explanation/how-to
      IFAD                      A   (Semi-structured) Purpose, how-to, tips
      IFRC 2002                 A   (Key informant and community) Overview
      FAO 2003                  A   (Semi-structured) Detailed discussion with definition, advantages and disadvantages
      MEASURE 2005              A   Brief introduction
      UNFPA 2004                A   (Key informant and community) Brief overview
      WB 2004b                  A   In a table, compared with other methods in terms of cost, training, time requirements and response rate
      WKKF 1998                 A   Overview
      SFCG 2006                 A   Explanation, table with strengths, weaknesses and cost (compared with other methods); (activity interview) as a tool for peacebuilding programs
      Boyce and Neale 2006a     B   Step-by-step guidance
      Cramb and Purcell 2001    B   Information focused on agriculture
      Mayoux a                  B   Review
      USAID Tips 1996b          B   Introduction and guidance
      Levinson et al 1999       B   Overview
Mapping
      Africare 2003             A   (Participatory) Detailed, with examples
      CARE 1997                 A   Information on social and historical mapping
      ALNAP 2003a               A   Short explanation and sample exercise
      CRS 1999                  A   Introduction and guidance on use; short description of its variations (territorial, regional, family, resource, historical, interest group and social maps)
      IFAD                      A   Purpose, how-to, tips (also has a section on GIS mapping)
      SAVE 2004                 A   (Open-ended, key informant, unstructured, semi-structured, self-report) Definitions and examples, related to psychosocial programming
      Cramb and Purcell 2001    B   Information focused on agriculture
      SFCG 2006                 A   (Conflict) Explanation, table with strengths, weaknesses and cost (compared with other methods)
Media content analysis
      SFCG 2006                 A   Brief explanation as a tool for peacebuilding programs
Mystery clients
      FOCUS 2000                A   Questionnaire for debriefing mystery clients
      Boyce and Neale 2006b     B   Step-by-step guidance
Nominal Group technique
      UNFPA 2004                A   Brief overview
      WKKF 1998                 A   Overview
Observation
      IFAD                      A   (Direct observation) Purpose, how-to, tips
      IFRC 2002                 A   Basic information
      CIVICUS                   A   (Participant) Description, advantages and disadvantages
      MEASURE 2005              A   Brief introduction
      SAVE 2004                 A   (Participatory, systematic, direct observation) Definitions and examples (related to psychosocial programming)
      UNFPA 2004                A   Brief overview
      WB-IPDET 2001             A   Overview, advantages and disadvantages, types of observer (unobtrusive, participant, obtrusive)
      WB 2004b                  A   In a table, compared with other methods in terms of cost, training, time requirements and response rate
      WKKF 1998                 A   Overview
      SFCG 2006                 A   (Direct) Explanation, table with strengths, weaknesses and cost (compared with other methods)
      Mayoux a                  B   (Direct) Review
      USAID Tips 1996d          B   Introduction and guidance
      Levinson et al 1999       B   Overview
Participatory methods (in general)
      IFRC 2002                 A   Introduction
      Africare 2003             A   Introduction and methods
      CRS 1999                  A   Introduction and contrast with non-participatory methods
      ALNAP 2003a               A   How-to guidance to facilitate participation
      WB 2004a                  A   Definition, use, advantages, disadvantages, and skill and time requirements
      SAVE 2006                 A   Introduction
      Mayoux b                  B   Introduction, including challenges, advantages and techniques
      USAID Tips 1996a          B   Introduction and guidance
      Aubel 1999                B   In-depth guide
Photography/video
      SFCG 2006                 A   Explanation, table with strengths, weaknesses and cost (compared with other methods)
      SAVE 2006                 A   Video: description, uses, weaknesses, strengths, prerequisites
PRA (Participatory Rural Appraisal)
      Africare 2003             A   Detailed explanation/how-to, contrasted with RRA
      CARE 1997                 A   Detailed explanation/how-to of the different methods used (mapping, rapid social organization profile, group brainstorming, ranking)
      CARE 2002                 A   Very brief explanation
      CRS 1999                  A   Detailed how-to and tools
      IFRC 2002                 A   Overview of methods
      SAVE 2004                 A   Definitions and examples (related to psychosocial programming)
      WB 2004a                  A   Definition, use, advantages, disadvantages, and skill and time requirements
      SAVE 2006                 A   (Participatory Learning and Action) Description; features; methods used (secondary sources, direct observation, interviews, focus groups, oral history, ranking and scoring, diagrams and maps, techniques for children); analyzing the results; variants (rapid rural appraisal, rapid assessment procedures); strengths and weaknesses; prerequisites
      USAID Tips 1996e          B   Introduction and guidance
Proportional piling
      ALNAP 2003a               A   Short explanation
Qualitative methods (in general)
      CARE 1997                 A   Introduction and contrast with quantitative data; information on analyzing this type of data (raw data, simple description, interpretation)
      CRS 1999                  A   Comparison with quantitative methods, different uses
      IFAD                      A   Introduction and some information on analysis
      UNFPA 2004                A   Brief overview of analysis
      WB-IPDET 2001             A   Data analysis information
      SAVE 2006                 A   Introduction
      Mayoux a                  B   Review, how it complements quantitative and participatory methods
      Levinson et al 1999       B   Overview
Quantitative methods (in general)
      CARE 1997                 A   Introduction and contrast with qualitative data; information on analyzing this type of data (raw data, simple description, interpretation)
      CRS 1999                  A   Comparison with qualitative methods, different uses
      IFAD                      A   Introduction and some information on analysis
      UNFPA 2004                A   Brief overview of analysis
      WB-IPDET 2001             A   Data analysis information
      SAVE 2006                 A   Introduction
      Ravallion 2002            B   Review of experimental and non-experimental methods for analysis
      Levinson et al 1999       B   Overview
Questionnaires
      SFCG 2006                 A   Explanation, table with strengths, weaknesses and cost (compared with other methods)
Ranking
      Africare 2003             A   Detailed
      CARE 2002                 A   (Wealth) Example and guidance
      CARE 1997                 A   Detailed explanation/how-to
      CRS 1999                  A   Introduction and guidance on its use
      IFAD                      A   (Simple ranking) Purpose, how-to, tips; information on ranking and prioritizing methods
      Cramb and Purcell 2001    B   (Preference, pairwise, wealth and well-being) Information focused on agriculture
      SFCG 2006                 A   (Pairwise) Explanation, table with strengths, weaknesses and cost (compared with other methods)
Role playing
      SFCG 2006                 A   Explanation, table with strengths, weaknesses and cost (compared with other methods)
RRA (Rapid Rural Appraisal)
      Africare 2003             A   Detailed explanation/how-to, contrasted with PRA
      CRS 1999                  A   Detailed how-to and tools
      IFRC 2002                 A   Overview of methods
      UNFPA 2004                A   Brief overview
      WB 2004a                  A   Definition, use, advantages, disadvantages, and skill and time requirements
      USAID Tips 1996e          B   Introduction and guidance
Sampling
      IFAD                      A   Random and non-random methods
      CARE 1997                 A   Probability and purposeful methods
      FOCUS 2000                A   Very good source of information; includes a sample size table and a focus on cluster sampling
      WB-IPDET 2001             A   Introduction to the different types, table with minimum sample sizes
      SAVE 2006                 A   Review of the different types of sampling
      Mayoux c                  B   Definition of different methods, and information about sampling for quantitative, qualitative and participatory methods
      Magnani 1997              B   Detailed guidance
Secondary data (in general)
      CARE 1997                 A   Review and analysis for this type of data
      IFRC 2002                 A   Information on its use
      SFCG 2006                 A   Explanation, table with strengths, weaknesses and cost (compared with other methods)
      Levinson et al 1999       B   Overview
Self-reported checklist
      WB-IPDET 2001             A   Overview, advantages and disadvantages
Stakeholder analysis
      IFAD                      A   Purpose, how-to, tips
      CARE 2002                 A   Brief explanation
Surveys
      ALNAP 2003a               A   (Baseline) In reference table, with objective, advantages, disadvantages, limitations and constraints
      CARE 1997                 A   Includes a short how-to on designing questions and information on rapid surveys
      CRS 2004                  A   Only contains a "baseline survey worksheet"
      IFAD                      A   Purpose, how-to, tips
      IFRC 2002                 A   Basic information; detailed information concerning baseline surveys (including data analysis and reporting)
      FAO 2003                  A   Detailed discussion with definition, advantages and disadvantages; includes advantages and disadvantages of common methods (face-to-face, telephone, forms, mail surveys, email and web surveys)
      MEASURE 2005              A   Introduction to the types of population-based surveys used for child health programs (USAID Demographic and Health Survey, or DHS; UNICEF Multiple Indicator Cluster Survey, or MICS; 30-Cluster Survey; Rapid Core Assessment Tool on Child Health, or CATCH)
      UNFPA 2004                A   Brief overview
      WB-IPDET 2001             A   Overview (self-administered and in-person), general guidelines for conducting a survey and developing a questionnaire
      WV 2003                   B   Guidance
      WB 2004a                  A   Definition, use, advantages, disadvantages, and skill and time requirements
      WB 2004b                  A   (Self-administered questionnaires) In a table, compared with other methods in terms of cost, training, time requirements and response rate
      WKKF 1998                 A   (Written questionnaires) Overview
      SFCG 2006                 A   Explanation, table with strengths, weaknesses and cost (compared with other methods)
      Zeller 2004               B   Information on different types of surveys: Living Standard Measurement Surveys (LSMS); Social Dimensions of Adjustment Integrated Survey (SDA-IS); Social Dimensions of Adjustment Priority Survey (SDA-PS); Core Welfare Indicators Questionnaire (CWIQ); Demographic and Health Survey (DHS)
      SAVE 2006                 A   Description, design, data collection and analysis, strengths, weaknesses, prerequisites and when they should be used
      Levinson et al 1999       B   Overview
SWOL/T/C (strengths, weaknesses, opportunities, limitations/threats/constraints) analysis
      CARE 1997                 A   Detailed explanation/how-to
      IFAD                      A   Purpose, how-to, tips
      CARE 2002                 A   Brief explanation
      FAO 2003                  A   Detailed discussion with definition, advantages and disadvantages; includes a sample matrix
      SAVE 2006                 A   Brief description
Tests
      WKKF 1998                 A   Overview of knowledge or achievement tests
      SFCG 2006                 A   Explanation, table with strengths, weaknesses and cost (compared with other methods)
Timelines
      ALNAP 2003a               A   (Historical) Sample exercise
Transect walk
      Africare 2003             A   Detailed, with example
      ALNAP 2003a               A   Example
      CRS 1999                  A   Introduction and guidance on use
      IFAD                      A   Purpose, how-to, tips
Triangulation
      Africare 2003             A   Detailed
      CARE 1997                 A   Introduction
      CRS 1999                  A   How-to guidance
      IFRC 2002                 A   Overview, as part of PRA
      FAO 2003                  A   Brief mention
      SAVE 2004                 A   Definitions and examples (related to psychosocial programming)
      SFCG 2006                 A   Mention
Venn Diagram
      Africare 2003             A   Detailed, with examples and variations
      ALNAP 2003a               A   Short explanation
      CRS 1999                  A   Introduction and guidance on use; description of one variation: the polarization diagram
      SFCG 2006                 A   Explanation, table with strengths, weaknesses and cost (compared with other methods)







                                                   Section A:
                                                M&E Manuals

                                                  Introduction
The first section of the toolkit contains a short review of the content of several manuals used by agencies to guide
their design, monitoring and evaluation (DM&E) processes. The manuals are ordered alphabetically. For each manual, the
following information is included:

        Electronic location (if available)
        File location of the resource (if available)
        Keywords and/or sector related to the manual
        A reviewer’s synopsis including a short comment about the resource content and usefulness
        Detailed content, according to the manual’s index
        List of annexes (if available)

The following matrix provides a review of the manuals contained in this section. The matrix includes the citation of
the manual, the sector and the methods or tools the manual presents.

                                      M&E Manuals: Review Matrix

   Africare 2003
      Sector: Food security; Participatory M&E
      Methods/Tools: RRA and PRA; triangulation; participatory mapping; transect walk; Venn diagram;
      calendars; wealth and food security ranking; historical profile; matrices; semi-structured interviews

   ALNAP 2003a
      Sector: Participatory M&E; Food security; Water and sanitation; Health; Education; Habitat and shelter
      Methods/Tools: Timelines; mapping; ranking; calendars; transect walk

   ALNAP 2003b
      Sector: Humanitarian programs; Training
      Methods/Tools: No tools

   ALNAP 2006
      Sector: Humanitarian programs
      Methods/Tools: General guidelines for methods, but no methods included

   CARE 1997
      Sector: General M&E
      Methods/Tools: Sampling; in general: secondary and primary data sources, qualitative and quantitative
      methods, participatory and non-participatory approaches, verbal and less verbal approaches; data
      analysis; PRA toolkit (social mapping, historical mapping, rapid social organization profile, group
      brainstorming, ranking exercises); focus group discussions; semi-structured interviews; qualitative
      interviews; rapid surveys; question design for surveys and interviews; strengths, weaknesses,
      opportunities, limitations/threats (SWOL/T) analysis

   CARE 2002
      Sector: General M&E; Program design
      Methods/Tools: PRA; ranking; stakeholder analysis; SWOT analysis

   CIVICUS
      Sector: General M&E; External evaluations
      Methods/Tools: Interviews and key informant interviews; questionnaires; focus groups; community
      meetings; fieldworker reports; ranking; participant observation; other methods (visual/audio stimuli,
      rating scales, critical event/incident analysis, self-drawings)

   CRS 1999
      Sector: Methods
      Methods/Tools: PRA and RRA methods: semi-structured interviews; participatory mapping (with a short
      description of its variations: territorial, regional, family, resource, historical, interest group and
      social maps); transect walk; Venn diagram (and the variation called the "polarization" diagram);
      calendars; wealth ranking; historical profile; and matrices

   CRS 2004
      Sector: Program design; General M&E
      Methods/Tools: No data collection tools

   IFAD
      Sector: General M&E; Capacity building; Participatory M&E; Training
      Methods/Tools: Baseline surveys; random and non-random sampling; stakeholder analysis; document review;
      direct observation; cost-benefit analysis; questionnaires and surveys; semi-structured interviews; case
      studies; methods for groups (brainstorming, focus groups, simple ranking, SWOT, dreams realized or
      visionary, drama and role plays); methods for spatially-distributed information (mapping, transects,
      GIS mapping, photographs and video); methods for time-based patterns of change (diaries, historical
      trends and timelines, seasonal calendars, most significant change); methods for analyzing linkages and
      relationships (mind maps, impact flow diagram, institutional linkage diagram, problem and objectives
      trees, M&E wheel, systems diagram); methods for ranking and prioritizing (social mapping, matrix
      scoring, relative scales or ladders, ranking and pocket charts)

   IFRC 2002
      Sector: General M&E; Humanitarian programs; Development programs
      Methods/Tools: Rapid appraisal methods (key informant interviews, focus groups, community interviews,
      direct observation, mini-surveys, visualization tools); participatory appraisal concepts; triangulation;
      community inventory; sample survey; Beneficiary Contact Monitoring (BCM); field visits; secondary
      data use
                                           SWOT
                                           Questionnaires and survey
                                           Semi-structured interviews/focus groups
 FAO 2003        Auto-Evaluation          country case study
                                           web statistics
                                          expert panels
                                           Short mention of triangulation
                                           Sampling and sample size
                                           Quantitative and qualitative data
                 General M&E
FOCUS 2000                                 Methods for health settings: observation, mystery clients,
                 Health
                                          community questionnaire, youth survey, focus groups (among others
                                          related to adolescent reproductive health)
                                           Introduces the different types of surveys used in child health
                                          programs
MEASURE
                 Child Health             In-depth interviews
  2005
                                           Observation
                                           Document review
Merci Corps      Program design
                                          No methods
   2003          General M&E
                 Participatory M&E
  SAVE                                    No methods
                 Design
                                           Interviews (open-ended or focus groups; key informant;
                                          unstructured or semi-structured, structured, self-report),
                                           Ethnographic techniques (participatory and systematic
                 General M&E
                                          observations; participatory appraisal)
SAVE 2004        Psychosocial
                                           Direct observation techniques (with common ways of recording:
                 Humanitarian programs
                                          narratives; event records; interval recording),
                                           Back translation of scales
                                           Triangulation.
SAVE 2006 | General M&E; Development, Emergency and Advocacy Programs; External consultant | Sampling; Qualitative, quantitative and participatory methods; Participatory Learning and Action (PLA); Surveys; Logical Framework Analysis (LFA); Cost-effectiveness (and cost-benefit) analysis; Strengths, Weaknesses, Opportunities and Constraints (SWOC) analysis; Programme visits
SFCG 2006 | Peacekeeping; General M&E; External evaluators | Direct observation; Interviews; Focus groups; Participant diaries; Photography/video; Project document review; Questionnaire; Secondary data review; Survey; Testing; Participatory learning and action techniques, or PLA (Venn diagrams, pairwise ranking, conflict mapping, drawing, role playing); Tools unique to peacebuilding programs: activity interview, cognitive social capital assessment tool (CSA), media content analysis tool, case study, capacity enhancement needs assessment (CENA); Four levels of training evaluation
UNDP 2002 | General M&E; Joint evaluation | Monitoring tools: field visits, annual project report, outcome groups, and annual review




UNFPA 2004 | General M&E; Participatory M&E; Maternal health | Qualitative and quantitative analysis; Rapid appraisal; Observation; Surveys; Key informant interviews; Focus groups, community interviews, and nominal group technique
UNICEF 1991 | General M&E | No methods
WB-IPDET 2001 | General M&E | Surveys; Focus groups; Interviews; Diaries; Checklists
WB 2004a | General M&E | Sampling; Formal surveys; Participatory methods; Rapid appraisal methods; Cost-benefit and cost-effectiveness analysis
WB 2004b | General M&E | Review of program records; Self-administered questionnaires; Interview; Rating by trained observer
WKKF 1998 | General M&E | Observation; Interviewing and group interviews (nominal group/Delphi techniques); Written questionnaires; Tests and assessments (physiological health status measure, knowledge or achievement test); Document review



                                            Review Section 
                                                               A
(Africare 2003)
Gervais, S., JC Bryson, K Schoonmaker Freudenberger. 2003. Africare Field Manual on the design,
implementation, monitoring and evaluation of food security activities. 256p.
Web Location: http://www.foodaid.org/pdfdocs/foodsecurity/Africare_Field_Manual_Complete.pdf
Location: Section A > Africare 2003
Keywords, sector(s): Food security, Participatory M&E
                                                    Reviewer’s Synopsis
The manual focuses on participation and food/nutrition security and is directed mostly at Africare personnel. Its
principal audience is hence the program administrators, technical advisors, and field workers who intervene directly
at the village level. It is a great source of information on food security issues in programming.
                                                           Content
Module 1: Africare’s Approach to Food Security Programs
      o Africare’s participatory approach to Food Security programs (also referred to as Development Assistance
          Programs, or DAP) and the importance and how to approach M&E in their programs.
Module 2: Time Line of the Steps in Development Assistance Program (DAP) Design and Implementation
     o Chronological sequence of activities to be carried out by the program with regard to collecting and utilizing
         information (presents these activities in a timeframe relevant to Africare): DAP design and preparation of the
         DAP document; Submitting the proposal for financing and securing host country approval; Team
         building/refinement of the food security framework and development of an understanding of the strategic
         framework; The baseline survey; Drawing up the M&E plan and detailed implementation plan; The mid-
         term evaluation; Continued DAP implementation; The impact evaluation
Module 3: Food and Nutrition Security Concepts
     o Food security concepts with examples and figures.
Module 4: Participation
     o Importance of participation in development programs, the different ways of looking at participation, how to
         select a community to be engaged in participation, and staff skills needed to facilitate community
         participation.
     o Participatory Information Management, including a brief mention of methods to be used.
Module 5: An Introduction to the Development of Information Systems and Their Use in the Management of
Africare DAPs
     o The DAP information system (including the M&E system) and information users and their needs (mostly
         specific to Africare).
     o Comparison between monitoring, midterm and final evaluations, including definitions, objectives,
         information users, focus, information needs, and frequency, among other factors.
     o Properties of the information and of the indicators, including definition and types of indicators (direct,
         composite and proxy)
Module 6: How to Develop a Food Security Framework and a Strategic Framework For Your
DAP
     o Food and nutrition security framework and the UNICEF framework, and how to develop one and use it to
         develop a baseline.
Module 7: Rapid And Participatory Rural Appraisal (RRA And PRA) and Their Application to Africare’s Food
Security DAPs
     o Comprehensive view of Rapid Rural Appraisal (RRA) and Participatory Rural Appraisal (PRA), the
         differences between the two as well as their application in the field.
o Detailed information on the methods for conducting the appraisals, including triangulation (taking into
         consideration the possible sources of bias and how to reduce it); participatory mapping (including
         examples); transect walk (including what types of issues can be covered with it and examples); Venn
         Diagram (including what types of issues can be covered with it, examples and variations); calendars (with
         examples); wealth and food security ranking; historical profile; matrices (with examples); semi-structured
         interviews.
     o Guidance on selecting the RRA and PRA teams, setting objectives of the study, carrying out the study,
         reporting.
     o Table illustrating the use of RRA tools to collect types of information needed in baseline.
Module 8: Practical Guidance for Implementing a DAP Information System
     o How to assemble and utilize information while preparing the DAP, including using secondary data (mostly
         related to Africare) and primary data sources.
     o Preparing an M&E system, including indicator development, and the establishment of a community-based
         information system.
     o How to develop capacity-building indicators with the population and how to integrate the information from
         different sources and approaches into the program’s monitoring and evaluation system.
o Appendix with "Suggested rankings for each indicator of Africare food security community capacity index".
o The following document is included (P.206): Gervais, S. September 20, 2003. Local capacity building in
         Title II food security projects: A framework. Food Aid Management. The document includes a section
         concerning M&E of local capacity building (P.230)
Annexes: Conceptual framework for understanding food insecurity; Monitoring and Evaluation processes (focused
on capacity building); Building an index for the measurement of local capacity building for food security; Case
Study: Africare’s food security community capacity index (FSCCI); Resources on useful approaches and techniques
for designing and implementing capacity building activities in food security projects.

(ALNAP 2003a)
Active Learning Network for Accountability and Performance in Humanitarian Action (ALNAP). 2003.
Participation by crisis-affected populations in humanitarian action practitioner’s handbook. Overseas
Development Institute. 352p.
Web Location: http://www.reliefweb.int/rw/lib.nsf/db900SID/LHON-5TNH95/$FILE/alnap_civilians_2003.pdf?OpenElement
Location: Section A > ALNAP 2003a
Keyword, sector(s): Participatory M&E, food security, water and sanitation, health programmes, education, habitat
and shelter.
                                                    Reviewer’s Synopsis
This manual provides guidelines for humanitarian agencies seeking to increase participation of people affected by
crises in their assistance programmes. Intended as a guide for international humanitarian personnel as well as staff
working for national and local organizations, the handbook is based on a comprehensive programme of research in
five different humanitarian emergency situations (Afghanistan, Angola, the DRC, Colombia and Sri Lanka), as well
as a literature review.
This guide provides good exercises to do in the field and contains questions for reflecting on the lessons learned
throughout the different sections. It also contains examples from the field. Part 3 of the guide provides more sector-
specific guidance.
                                                          Content
Introduction: Participation Of Crisis-affected Populations in Humanitarian Action
      o Defines participation and its approaches (instrumental, collaborative, supportive).
      o Types of participation in humanitarian action.
      o Information on developing a strategy for participation.
Part 1 Designing a Strategy for Participation in Humanitarian Action
Chapter 1: Factors Affecting Participation in Humanitarian Action
     o Guidance to develop a strategy taking into considerations factors affecting the humanitarian participatory
          operation.
     o Staff qualities to allow for participation
Chapter 2: The Communication Imperative
o Communication as a "two-way process" with guidance as to how to facilitate it, including formal and
          informal tools.
     o Guidance on conducting a focus group.
Part 2 Participation throughout the Project Cycle
Chapter 3 Assessment
      o Cross-cutting issues to take into consideration during a participatory assessment: security and protection;
           discrimination and minorities; impartiality and independence.
o Guidance to understanding the situation context in terms of history (including a "historical timeline"
            exercise); the geography and the environment (with a "mapping exercise"); society and the economy
           (including short explanation of the following tools: wealth and vulnerability ranking, with sample exercise;
           identification of the population resource basket; identification of the pillars of survival with sample
           exercise; analysis of production processes, with sample exercise; proportional piling, with sample
           exercise); time (with a seasonal activity calendar exercise and daily schedule).
      o Exercises to facilitate an understanding of the crisis and its effects: Analysing the crisis; comparison maps
           (before and after the crisis); the transect walk.
o Guidance to identifying stakeholders and "who is who" in the community, including the Venn diagram,
           and a proximity-distance analysis exercise.
Chapter 4 Design
o As in the previous chapter, it highlights the cross-cutting issues to take into consideration during
            participatory design.
      o Presents participatory tools for program design, such as the problem/solution tree, among others.
Chapter 5 Implementation
o As in the previous chapter, it highlights the cross-cutting issues to take into consideration during
            participatory implementation.
      o Presents participatory tools for program implementation.
Chapter 6 Monitoring
o As in the previous chapter, it highlights the cross-cutting issues to take into consideration during
           participatory monitoring.
      o Different approaches to participatory monitoring (instrumental, collaborative and supportive) and their key
          principles.
     o Guidance for designing and implementing a participatory monitoring system




    o    Table with the tools available for participatory monitoring, with objectives, advantages, limitations and
         constraints. The following are included: focus groups, roundtables and meetings; individual interviews;
         surveys; mechanisms to protect anonymity; monitoring days; feedback mechanisms.
o Key factors for "successful participatory monitoring."
Chapter 7 Evaluation
o As in the previous chapter, it highlights the cross-cutting issues to take into consideration during
           participatory evaluation.
      o Different approaches to participatory evaluation (instrumental, collaborative and supportive) and their key
          principles.
     o Discusses the following steps during the evaluation process:
         1. Designing the evaluation process, including elements for the ToR.
         2. Implementing the evaluation, with a table discussing the objectives, advantages, limitations and
         constraints of the following tools: focus groups, roundtables and meetings; individual meetings; surveys;
         mechanisms to protect anonymity; evaluation days; social audit by an external evaluator; feedback
         mechanisms.
o Key factors for "successful participatory evaluation."
Part 3 Sector-related Issues
Chapter 8 Participation and Food Security
o Food security concepts, including coping mechanisms, and participation in food distribution, nutrition
         programmes, and agricultural rehabilitation.
     o Short discussion regarding M&E in participatory food distribution, nutrition programmes, and agricultural
         rehabilitation.
Chapter 9 Participation and Water/Sanitation Programmes
     o Participatory WatSan assessments (including local skills and stakeholder identification), design, and
         implementation.
     o Short section on WatSan participatory M&E
Chapter 10 Participatory Habitat and Shelter Programmes
     o Settlements and shelters and program participatory assessment, design and implementation.
     o M&E issues specific to the area.
Chapter 11 Participation and Health Programmes
     o Health program participatory assessment, design and implementation.
     o M&E issues specific to the area.
Chapter 12 Participation and Education
     o Education programs and participatory assessment, design and implementation.
     o M&E issues specific to the area

(ALNAP 2003b)
Active Learning Network for Accountability and Performance in Humanitarian Action (ALNAP). 2003.
Training Modules.
Web Location: http://www.alnap.org/resources/training.htm
Location: Section A > ALNAP 2003b (NOTE: Only contains sessions for each module. Go to website to
download handouts and trainer tips)
Keywords, Sectors: Humanitarian Programs, Training
                                              Reviewer’s Synopsis
The link above contains the modules, each to be downloaded individually in either MS Word or Adobe format. The
modules contain the sessions and hand-outs. The link also contains training tips. Each module contains suggested
uses, sessions and key messages.
                                                     Content
Module 1: Introduction to evaluation of humanitarian action
Module 2: Evaluation of humanitarian action- the evaluator’s role
Module 3: Managing and facilitating evaluations in humanitarian action

(ALNAP 2006)
Beck, T. 2006. Evaluating humanitarian action using the OECD-DAC criteria. Active Learning Network for
Accountability and Performance in Humanitarian Action (ALNAP). 80p.



Web location: http://www.alnap.org/publications/eha_dac/pdfs/eha_2006.pdf
Location: Section A > ALNAP 2006
Keywords, sectors: Humanitarian programs
                                               Reviewer’s Synopsis
The guide provides support on how to use DAC (Development Assistance Committee) evaluation criteria for
humanitarian action, based on other agencies’ best practices, but does not provide significant details concerning
methodology or evaluation approaches. The guide contains examples of "good practice" for each of the DAC
criteria.
                                                      Content:
1 Purpose and use of this guide
     o See synopsis
     o Definition of the DAC criteria
2 Thinking about EHA (evaluation of humanitarian action)
     o Key themes and issues current in EHA, particularly in relation to lesson-learning and accountability, and
evaluation use, to provide general context, including a definition of EHA and how it differs from other
           types of evaluations.
3 Working with the DAC (development assistance committee) criteria
     o Definition, explanation of the definition, issues to consider, key messages, and two examples of good
          practice intended to provide pointers and suggestions on how to use each of the seven DAC criteria
          (relevance/appropriateness, connectedness, coherence, coverage, efficiency, effectiveness, impact).
4 Methods for the evaluation of humanitarian action: pointers for good practice
     o One page guidelines for good practice in methods for the evaluation of humanitarian action.
Annexes: Joint Committee on Standards for Educational Evaluation; Checklist of themes to be covered in EHA

                                                              C
(CARE 1997)
CARE-Uganda. January 1997. Guidelines to Monitoring and Evaluation. 153p.
Tom Barton, CRC.
Web Location: http://www.care.ca/libraries/dme/CARE%20Documents%20PDF/M&E%20Guidelines%20(C-
Uganda)/M&E%20Guidelines%20(CARE-Uganda)%20--%20English.pdf
Location: Section A > CARE 1997
Keywords, sectors: General M&E
                                                     Reviewer’s Synopsis
Its intended audience is CARE staff, but it could be used by other development agencies. Although the manual's first
chapters relate to CARE programs, it contains quick reference tables applicable to any program, especially during the
M&E plan design. The guide provides general information on study design and data collection that does not require
a statistical background to understand. More guidance is needed for quantitative studies. Annex 4 contains
information regarding data collection methods (see below).
                                                           Content
Chapter 1: Why is information important to projects?
o Overview regarding the "project information system"
Chapter 2: Who needs information about CARE projects? What information is needed and why?
     o Key categories of people and organizations who may be interested in obtaining information.
     o Addresses the concern on the quality of the information and its use.
o Needs and concerns of their "six key information users" (Community, Local organizations, Government,
     Project staff, CARE Country Office, CARE International) in a table format.
Chapter 3. What are some of the common strengths and weaknesses of information gathering, analysis, and use in
development projects?
     o General information mostly for CARE programs.
Chapter 4. What key concepts are fundamental to understanding and planning for information management?
     o Project cycles, information needs, the logical framework (log frame).
o Good, concise table titled: "Hierarchy of Objectives".
     o The structure of M&E Information system for each goal, including the different types of monitoring
          activities (Institutional, context, results and objectives monitoring) with a comprehensive table linking them
          to the objectives; and the different types of evaluations (Baseline study, annual review, mid-term
          evaluation, final evaluation, ex-post evaluation), with a table linking them to the different program stages.


     o Design issues.
Chapter 5. What needs to be included in project planning in order to have the desired information available at the
right time in usable form?
o Information regarding M&E planning at different stages of a project, including a "planning matrix".
Chapter 6. Indicators - what do we (or the users) specifically want to know about projects?
o Defines indicators and presents the different types of indicators, including the types by objective
          level (quick reference table included).
     o Information concerning the criteria for indicator selection. The section also provides examples and
         technical considerations.
Chapter 7. Sources - where can we find the information we need? Sampling - how can we be sure information is
representative?
o General and brief information on finding information, including the sources, how to collect the data,
          analyze it and disseminate it. The chapter also introduces the concept of "bias".
      o Introductory information about sampling, easily understandable without requiring a statistical background.
      o Discusses the different types of probability and purposeful sampling methods, but even though it mentions
          the issue of sample size, it does not provide clear guidance on its calculation (a standard formula is
          sketched below for reference).
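Since the CARE guide stops short of sample-size arithmetic, the following is offered as a hedged reference only; it is the standard textbook formula for estimating a proportion, not taken from the CARE manual (z is the confidence coefficient, p the expected proportion, e the margin of error and N the population size):

    n_0 = \frac{z^2 \, p(1-p)}{e^2}, \qquad n = \frac{n_0}{1 + (n_0 - 1)/N}

With z = 1.96 (95% confidence), p = 0.5 and e = 0.05, n_0 is approximately 384, i.e. roughly 385 respondents before any adjustment for clustering or non-response.
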
Chapter 8. Methods for gathering information - how to obtain the information desired?
      o Guidance for choosing between secondary and primary data sources, quantitative and qualitative data,
          participatory and non-participatory methods, verbal and less verbal approaches (related to the population
          literacy level).
      o Tables concerning participatory method examples and applications.
o "Comparison chart to facilitate selection" for primary data collection.
      o Triangulation methods.
Chapter 9. Analysis methods - how to understand and give meaning to raw data/information?
     o General steps for analyzing the data.
     o Information on specific quantitative techniques (descriptive and inferential statistics) and qualitative
         techniques (raw data, simple description, interpretation)
Chapter 10. And then what? - Presentation of findings and ensuring action
     o Guidelines for writing and presenting the evaluation findings, with a focus on CARE programs.
Chapter 11. Issues affecting internal project planning and operations related to M&E
     o Guidance on constructing an M&E plan, including logistics and criteria for a project information system.
Annexes: Glossary; Abbreviations (acronyms); Components of key documents in M&E system;
Methods [The following methods are discussed: participatory rapid toolkit; reviewing and analyzing secondary data;
social mapping (PRA); Historical mapping (PRA); Rapid social organization profile (PRA); Focus group
discussions; Semi-structured interviews; Qualitative interviews; Rapid surveys; Question design for surveys and
interviews; Group brainstorming (PRA); Ranking exercises (PRA); Strengths, weaknesses, opportunities,
limitations/threats (SWOL/T)], Alternative terminology for Log Frame elements; References.

(CARE 2002)
Caldwell, R. July 2002. CARE Project Design Handbook. 176p.
Web location:
http://www.ewb-international.org/pdf/CARE%20Project%20Design%20Handbook.pdf
Location: Section A > CARE 2002
Key words, sectors: Design; General M&E

                                             Reviewer’s Synopsis:
This handbook is mostly focused on the project design process according to CARE. It contains illustrative figures,
examples and short review tables for chapters 2-6. Chapter 5 contains some information regarding the M&E plan,
but it is very general. The annex contains a one-page table with "information for monitoring project process," but does
not contain much guidance regarding M&E methodology.
                                                    Content:
Chapter 1: Introduction
     o Project DME cycle, defines ―project design‖ and the components of the project hierarchy (impact, effects,
         outputs, activities, inputs).
     o CARE project design and standards, and its rights-based approach (RBA)
Chapter 2: Holistic Approach


    o    The holistic approach, and how such approach, combined with their RBA, affects CARE’s livelihood
         security approach.
    o The operating environment and diagnostic tools, with a very brief explanation of the following sample
tools: PRA, benefits-harms "profile" tools, stakeholder analysis, SWOT analysis.
    o Participant selection, needs assessment, wealth ranking, stakeholder and institutional analysis, gender
         analysis.
Chapter 3: Analysis & Synthesis Techniques in Design
    o Guidance on organizing the information obtained through the analysis presented in the previous chapter,
including the use of causal analysis (with an explanation of the "Pareto principle") for project design and
         the hierarchical causal analysis.
    o Methods discussed for causal analysis: group brainstorm/consensus, fishboning, cause and consequence
         analysis, causal tree.
Chapter 4: Focused Strategy
    o Guidance on selecting causes to be addressed for the causal analysis, assessing comparative advantage of
         the organization, developing the project hypothesis, logic models.
o Tools for choosing between intervention options: Multiple criteria utility assessment tool (MCUA), Pair-wise
         ranking matrix.
Chapter 5: Coherent Information Systems
    o Table comparing donor agency terminologies for the results/logical framework.
    o Project goals, operational definitions, indicators, benchmarks.
    o Guidance for developing project outputs, activities and inputs; and developing an M&E plan, including
         characteristics, how it is influenced by the RBA, a sample M&E system plan.
Chapter 6: Reflective Practice
    o Reflective practice and how to use it to improve the project design process.
Annexes: CARE International Project Standards; Operating Environment-Analysis Themes (table form); A Needs
Assessment Case Study; Wealth Ranking example; glossary of tools; Causal Analysis Practice Exercises; Applying
Appreciative Inquiry; Project Hypothesis; Goals and Indicators for an Agricultural Project; Goals and Indicators for
Small Business Development Project; Indicators at different levels in the project hierarchy; CARE’s Project
Outcome Model; United Way’s Program Outcome Model; Canadian Performance Framework; Classical 4x4
LogFrame; USAID Results Framework; Information for Monitoring Project Process; Practical Exercise: Writing
Clear and Precise Goals; Goal Statements and Operational Definitions; Glossary of Definitions and Acronyms;
Bibliography and suggested reading.

(CIVICUS)
World Alliance for Citizen Participation (CIVICUS). Monitoring and Evaluation. 50 p.
Web location:
http://synkronweb.aidsalliance.org/graphics/NGO/documents/english/705a_Monitoring_and_Evaluation.pdf
Location: Section A > CIVICUS
Keywords, Sector: General M&E, External evaluators
                                                 Reviewer’s Synopsis:
Quoting the resource: This toolkit deals with the nuts and bolts (the basics) of setting up and using a monitoring and
evaluation system for a project or an organization. It clarifies what monitoring and evaluation are, how you plan to
do them, how you design a system that helps you monitor and an evaluation process that brings it all together
usefully. It looks at how you collect the information you need and then how you save yourself from drowning in data
by analyzing the information in a relatively straightforward way. Finally it raises, and attempts to address, some of
the issues to do with taking action on the basis of what you have learned.
The manual could be used as a short introduction to M&E, since it is only 50 pages. Other resources will be needed to
draft the M&E plan.
                                                       Contents:
Basic Principles
o M&E definitions, advantages and disadvantages of using external evaluations, and how to select an
           evaluator or evaluation team.
     o Planning an evaluation, guidance on developing indicators, definition of qualitative and quantitative
          information.
     o Suggested processes to design a monitoring system.



    o    Description, advantages and disadvantages for the following methods: interviews, key informant
         interviews, questionnaires, focus groups, community meetings, fieldworker reports, ranking, visual/audio
         stimuli, rating scales, critical event/incident analysis, participant observation, self-drawings.
o Other topics: information on reporting, with a sample outline; dealing with resistance.
Best Practice
    o Examples of indicators for economic, social, and political/organizational development.

(CRS 1999)
Freudenberger, K. 1999. Rapid Rural Appraisal & Participatory Rural Appraisal (RRA/PRA). Catholic Relief
Services. 119p.
Web Location: http://www.crs.org/publications/pdf/Gen1199_e.pdf
Location: Section A > CRS 1999
Keyword, sector: Methods (PRA, RRA)
                                                  Reviewer’s Synopsis
The purpose of this manual is to familiarize users with RRA and PRA methods, to demonstrate the applicability of
these methods to CRS funded projects, and to encourage the rigorous application of the methods in order to obtain
the best results.
This manual is a good tool when using the methods illustrated in Part III. Volume II of this publication is not
available electronically.
                                                         Content
VOLUME I
Part I: An Introduction to Information Gathering, Participatory Research, and RRA and PRA
o Compares qualitative and quantitative methods; participatory and "top-down" methods; RRA and PRA
           (including their uses)
Part II: How to Put Together an RRA or PRA to do Field Research
     o The importance of triangulation in RRA and PRA and how to triangulate (tools and techniques).
     o Different types of bias and tips on how to monitor it.
o Guidance on conducting the RRA study: selecting the team (including the consultant); selecting the study
           objectives and study site; organizing the information needs and time; and conducting feedback sessions and
           team interaction meetings.
     o Guidance on conducting a PRA and information on related field activities.
     o Analyzing the data and reporting.
Part III: The Tools and Techniques Used to Gather Information in RRA and PRA
     o Introduction and guidance on the use of the following tools: semi-structured interviews; participatory
          mapping (with a short description of its variations: territorial, regional, family resource, historical, interest
group and social maps); transect walk; Venn Diagram (and the variation called "polarization" diagram);
          calendars; wealth ranking; historical profile; and matrices.
Postscript: Maintaining Flexibility, Creativity, and Your Sense of Adventure
VOLUME II (not available)
Part I: Using RRA and PRA for Sectoral Research
Part II: Case Studies from the Field
Appendices

(CRS 2004)
Stetson, V., G. Sharrock and S. Hahn. 2004. ProPack: The CRS Project Package. Catholic Relief Services.
222p.
Web Location: http://www.crs.org/publications/pdf/Gen0904_e.pdf
Location: Section A > CRS 2004
Keywords, Sector: Design, General M&E
                                                 Reviewer’s Synopsis:
ProPack was written to assist in program design and proposal writing. Below is the content of the manual, but for
the purpose of this toolkit, emphasis is placed on its M&E content. The manual is written in a simple way, with step-
by-step guidance for project design. It uses diagrams and field examples to illustrate some aspects of the topics,
but does not provide quick reference tables or any other visual aid to use as a memory refresher, as other manuals do.




The tools provided in Chapter IV relate to managing and organizing M&E, not to tools for data
collection.
                                                      Content:
Chapter I: Introduction to ProPack
Chapter II: Concept Notes
Chapter III Project Design Guidance
     o Topics include: planning the project design; stakeholder analysis; assessment plan, methodology and the
         use of baselines; using conceptual frameworks (the Integral Human Development Framework) to guide
assessments; reviewing the strategy.
Chapter IV: Results Frameworks, Proframes and M&E Planning
     o The Proframe (their equivalent to a logframe combined with a results framework- used by USAID) and
         how M&E relates to each column.
     o General information on performance indicators, how to select, revise and measure them.
o "Critical assumptions" and their importance.
o Presentation of some M&E planning tools, such as: the measurement method/data sources worksheet;
          performance indicator tracking table; baseline survey worksheet; M&E calendar (a minimal tracking-table
          sketch follows below).
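As an illustration of what a performance indicator tracking table computes, here is a minimal Python sketch; it is not taken from ProPack, and the indicator names and figures are hypothetical.

# Hypothetical indicator tracking: compare achieved values against targets.
indicators = [
    {"name": "Households with year-round water access", "target": 500, "achieved": 430},
    {"name": "Volunteers trained in first aid", "target": 200, "achieved": 215},
]

for row in indicators:
    pct = 100 * row["achieved"] / row["target"]
    print(f'{row["name"]:45s} target={row["target"]:4d} '
          f'achieved={row["achieved"]:4d} ({pct:.0f}% of target)')

In practice the same comparison is usually kept in a spreadsheet and updated each reporting period.
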
Chapter V: Project Proposal Guidance
     o Proposal planning in CRS and planning for M&E.
     o Table comparing different languages used by other organizations on the logical planning framework.
Chapter VI Further Resources
     o Glossary and other project planning tools.

                                                         F
(FAO 2003)
Food and Agriculture Organization of the United Nations. November 2003. Auto-Evaluation Guidelines,
Version 1.1. Evaluation Service (PREE) 50p.
Web Location: http://www.fao.org/docs/eims/upload/160705/auto-evaluation_guide.pdf
Location: Section A > FAO 2003
Keyword, sector(s): General M&E, Auto-evaluation.
                                                   Reviewer’s Synopsis
This guide is a response to FAO’s strengthening of its evaluation system. It deals specifically with Auto-Evaluation
(AE), the review of program achievements over a longer period of time, for FAO programs. Although directed at
FAO programs, it contains good information regarding some techniques used during evaluations. However, the user
will need other sources of information to complete the evaluation design.
                                                         Content
Part I: Conceptual framework
     o The conceptual framework and terminology used in it (input, activity, biennial output, major output,
          outcome, programme entity objective, programme entity rationale) and reviews the consequences of an AE.
Part II: Procedures
o Guidance through the AE procedure, including the scope, timing and staff roles during AE (including a table
          summarizing the roles).
Part III: Planning for an Auto-evaluation
o AE planning, including the following: how to define the issues to be evaluated, with examples; deciding on
           methodologies, including a short introduction with definition, advantages and drawbacks of the following
           techniques: indicators, desk studies/annotated bibliographies, SWOT (strengths, weaknesses, opportunities
           and threats) analysis, questionnaire surveys, semi-structured interviews/focus groups, country case study,
           web statistics, expert panels, and a short mention of triangulation; estimating the budget and drafting the
           ToR (short sections)
Part IV: Managing an auto-evaluation
     o How to sequence the AE steps, with a sample timeline, and tips for reporting results.
     o AE quality standards.
Part V: Auto-evaluation techniques
     o More detailed discussion of the techniques used in an AE, including indicators; desk studies and literature
          surveys; SWOT analysis, with a matrix; semi-structured interviews; focus group interviews; questionnaire
          surveys, with a table illustrating the advantages and disadvantages of common surveying methods (face-to-



       face interviews, telephone interviews, feedback forms, mail surveys, email surveys, web surveys) and the
       types of questions used; web statistics; field visit and country case studies; peer review panels.
Annexes: Outline of ToR; outline of auto-evaluation report; bibliography.

(FOCUS 2000)
Adamchack, S., K. Bond, L. MacLaren, R. Magnani, K. Nelson and J. Seltzer. 2000. A guide to monitoring
and evaluating adolescent reproductive health programs. FOCUS on Young Adults, Tool Series 5.
Web location: Part I: (285p) http://pf.convio.com/pf/pubs/focus/guidesandtools/PDF/Part%20I.pdf, Part II (179p):
http://pf.convio.com/pf/pubs/focus/guidesandtools/PDF/Part%20II.pdf
Location: Section A > FOCUS 2000 > (Part 1, Part 2)
Keyword, sector: Health, General M&E
                                                  Reviewer’s Synopsis
The intended audience for this guide is program managers who monitor and evaluate adolescent reproductive
health (ARH) programs, but the concepts presented can be used for M&E in other sectors. The guide contains a
useful table with guidance on how to use it (Part 1, page 70). It is a great source of information
concerning study design, cluster sampling and sample size. Part II of the guide is very specific to ARH programs,
containing data collection instruments, as listed below.
                                                         Content
Introduction
o Importance of M&E in youth programs and an introduction to M&E in general.
Part I: The How-To’s of Monitoring and Evaluation
1: Concerns about monitoring and evaluating ARH (Adolescent Reproductive Health) Programs
     o Challenges of M&E in ARH programs and how to overcome them. These can be generalized to other
          sectors.
2: A framework for ARH Program M&E
     o Information concerning ARH and related programs.
3: Developing an ARH M&E Plan
     o Defining program goals, outcomes, and objectives, and the scope of work for M&E.
o M&E usefulness for different actors (program staff, funding agencies and policymakers, and communities
           and youth) and how M&E fits into different stages of the program.
     o Preparing the M&E budget, and involving the youth in the M&E process.
o Provides a chart to use as guidance on which M&E effort should be undertaken, and a table on how to
           use the guide (p. 70, Part I)
4: Indicators
     o Defines and explains the different types of indicators (design, system development and functioning,
          implementation, and outcome indicators)
     o Guidance on selecting indicators
5: Evaluation Designs to assess program impact
     o Rationale for conducting impact evaluations
     o Review of the different designs: randomized, quasi-experimental, non-experimental, panel studies
          (including a summary table for the first three)
     o Information on hiring external consultants and on minimizing threats to external validity.
6: Sampling
     o Probability and non-probability sampling methods, focusing more on the process of sampling in clusters
          (step-by-step guidance)
o Selecting a sample size, including a "sample size table" (the appendices also provide more information); a
           note on the cluster design effect is sketched below.
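Because the guide concentrates on cluster sampling, the following reminder of the design effect may help; it is a standard formulation in common survey notation, not reproduced from the FOCUS guide (m is the number of respondents per cluster and \rho the intra-cluster correlation):

    DEFF = 1 + (m - 1)\rho, \qquad n_{cluster} = DEFF \times n_{SRS}

For instance, with clusters of m = 20 and \rho = 0.05, DEFF = 1.95, so a simple-random-sample requirement of about 385 grows to roughly 750 respondents.
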
7: Data Collection and the M&E work plan
     o Guidance on preparing for data collection (including on data collector’s training) and consideration of
          ethical issues
     o Data collection methods (In part II, listed below, by chapter), including a table summarizing key points for
          each; guidance on selecting a method (including comparison between qualitative and quantitative methods
          and how to blend them), approaching the data collection, and developing a work plan.
8: Analyzing M&E Data
     o Processing M&E data
o Presentation of qualitative data, such as case studies, process analysis, causal flow charts, and taxonomies.




    o     Analysis of quantitative data, including descriptive and inferential statistics, and a table with the methods
          used to analyze the data (tabulating, cross-tabulation, aggregating, disaggregating and projecting)
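To make the chapter's cross-tabulation step concrete, here is a minimal Python sketch; it is not part of the FOCUS guide, and the variable names and sample records are invented for illustration only.

from collections import Counter

# Hypothetical survey records: each respondent has a sex and a yes/no outcome.
records = [
    {"sex": "female", "used_service": "yes"},
    {"sex": "female", "used_service": "no"},
    {"sex": "male", "used_service": "yes"},
    {"sex": "male", "used_service": "yes"},
]

# Count respondents in each (sex, answer) cell of the cross-tabulation.
table = Counter((r["sex"], r["used_service"]) for r in records)

# Report row percentages, the usual way a cross-tabulation is read.
for sex in ("female", "male"):
    row_total = sum(table[(sex, answer)] for answer in ("yes", "no"))
    for answer in ("yes", "no"):
        share = 100 * table[(sex, answer)] / row_total if row_total else 0
        print(f"{sex:6s} {answer:3s} {table[(sex, answer)]:3d} ({share:.0f}%)")

The same tabulate, cross-tabulate and aggregate logic scales to a real dataset or a statistical package; the point is only to show what the chapter's cross-tabulation produces.
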
9: Using and disseminating M&E results
     o Rationale and guidance for M&E result dissemination, with tips for writing a press release and presenting
          the results orally.
10: Tables of ARH Indicators
     o Tables with indicators, data sources and instruments for collection.
Appendices: Sampling schemes for Core Data Collection Strategies; How to calculate Sample Size Requirements.
Part II: Instruments
This section contains the instruments listed below, specific to ARH programs.
1: Checklists
2: Tally Sheets
3: Reporting Forms
4: ARH Coalition Questionnaire
5: Composite Indices
6: Inventory of facilities and services
7: Observation guide for counseling and clinical procedures
8: Interview guide for staff providing RH Services
9: Guide for client exit interview
10: Questionnaire for debriefing mystery clients
11: Community questionnaire
12: Comprehensive youth survey
13: Focus group discussion guide for in-school adolescents
14: Assessing coalition effectiveness worksheet
15: Parents of youth questionnaire

                                                          I
(IFAD)
International Fund for Agricultural Development (IFAD). A guide for project M&E: Managing for Impact in
Rural Development.
Web Location: http://www.ifad.org/evaluation/guide/index.htm (Downloadable in zip or by section, and in different
languages: Arabic, French, Spanish, English)
Location: Section A > IFAD > (individual sections)
Key Words, sectors: General M&E, capacity building, participatory M&E, training
                                                  Reviewer’s Synopsis
The guide is intended for managers, M&E staff, consultants and IFAD and cooperating institution staff, but it
appears most relevant for managers. From the web location, one can access a "navigating guide", which shows
how the guide can be used by the different types of audience. It contains figures and example narratives to illustrate
the concepts and further reading for each section. Annex D (see below) contains an overview and tips for 34 M&E
methods.
                                                       Content
Section 1: Introducing the M&E Guide
     o Information mostly relevant to IFAD and other information on key stakeholders and their role in M&E.
Section 2: Using M&E to Manage for Impact
     o Focused on management. It provides an overview and basic M&E definitions.
o Elements of managing for impact, highlighting the learning opportunities during the project cycle and the
          methodological, communication and information-quality implications related to the project cycle.
     o The M&E system, participatory M&E
     o Creating learning environment/opportunities.
Section 3: Linking Project Design, Annual Planning and M&E
     o Design adaptation during the project cycle, good practices for project design, capacity building and
         sustainability (including indicators for M&E),
     o Explains the use of the logical framework approach (LFA), including a chart about how to write it and
         another with options for adjusting the structure of the matrix, detailed information regarding project
         assumptions in the matrix.
     o How design influences M&E.


Section 4: Setting up the M&E system
o Setting up the M&E system, including key tasks during the project cycle, set-up steps, documenting the
          M&E plan, updating the plan.
     o Overview of performance questions, critical reflection schedule, reporting, M&E conditions and capacities,
          assessing the quality of the M&E plan.
Section 5: Deciding what to Monitor and Evaluate
     o Different information needs and indicators depending on the stakeholders, and how to update it.
     o Summary table with the content of the M&E matrix and how to use it.
     o More detailed information on the use of performance questions.
     o Indicators, including how to work with qualitative information and indicators, how to check the indicator’s
          quality.
     o Baselines, options for making comparative analysis (including the mention of control groups), alternatives
          to baselines.
Section 6: Gathering, Managing and Communicating Information
     o Data transfer throughout the project.
     o (More detailed information in annex D) Considerations when choosing appropriate M&E methods,
          including qualitative and quantitative methods, individual and group methods, participatory methods.
     o Collecting the data, common errors during data collection, data recording and verification.
     o Collating information for analysis, analysis of data (including qualitative and quantitative), storing M&E
          information, communicating M&E findings, feedback sessions
Section 7: Putting in Place the Necessary Capacities and Conditions
     o Building the necessary capacities for the team, incentives, budgeting to support the M&E team.
     o Overview of the capacities needed for M&E, including for participatory M&E.
     o Developing an M&E training plan, organizing M&E structures and responsibilities, using the project
          system for managing information (including the pros and cons of a computerized system), financial
allocations.
Section 8: Reflecting Critically to Improve Action
     o Critical reflection and its role in M&E.
Annexes: A. Glossary of M&E concept and terms; B. annotated example of a project logframe matrix and logframe
explanation; C. annotated example of an M&E matrix; D. Methods for monitoring and evaluation (Methods
included: random and non-random sampling; stakeholder analysis; document review; direct observation; cost-benefit
analysis; questionnaires and surveys; semi-structured interviews; case studies; methods for groups- brainstorming,
focus groups, simple ranking, SWOT, dreams realized or visionary, drama and role plays; methods for spatially-
distributed information- mapping, transects, GIS mapping, photographs and video; methods for time-based patterns
of change- diaries, historical trends and timelines, seasonal calendars, most significant change; methods for
analyzing linkages and relationships- mind maps, impact flow diagram, institutional linkage diagram, problem and
objectives trees, M&E wheel, systems diagram; methods for ranking and prioritizing- social mapping, matrix
scoring, relative scales or ladders, ranking and pocket charts); E. sample job descriptions and terms of reference for
key M&E tasks.
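Annex D lists cost-benefit analysis among its methods; as a reminder of the underlying arithmetic only (standard notation, not copied from the IFAD guide):

    NPV = \sum_{t=0}^{T} \frac{B_t - C_t}{(1 + r)^t}

where B_t and C_t are the benefits and costs in year t and r is the discount rate. A cost-effectiveness ratio instead divides total (discounted) cost by units of a non-monetised outcome, for example cost per household reached.
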

(IFRC 2002)
IFRC. 2002. Handbook for Monitoring and Evaluation, 1st edition. 163p.
Ibrahim Osman, M&E Division, October 2002
Web location: http://www.ifrc.org/cgi/pdf_evaluation.pl?handbook.pdf
Location: Section A > IFRC 2002
Key words, sectors: General M&E, Emergency programming, Development programming.

                                               Reviewer’s Synopsis
The objective of the Guidelines is to provide International Federation staff, National Red Cross and Red
Crescent Societies and implementing partners with the necessary information to be able to design, manage and
support a results-oriented monitoring and evaluation system for emergency and development operations.
The guide is written in a simple way. The modules are self-contained. This guide could be useful at the field level
and as a general introduction for conducting M&E at the IFRC, since it provides information on organization
standards. The tables and figures contained in the guide allow anyone in need of a quick reference to find the
information needed in a concise format. It should be complemented by the IFRC Evaluation Framework located in
Section B of this toolkit.
                                                        Content
Module 1: Overview
     o Monitoring, evaluation, results-based management, the result chain (the operational sequence to achieve the
         desired goal), and the logical framework or logframe matrix.
o Emphasis on the central importance of the logframe for deriving M&E questions for a "robust" system.
Module 2: Monitoring and Evaluation
     o Step by step preparation of an M&E plan, including: Checking the operation design; Assess M&E capacity;
         Data collection and analysis plan; M&E budget and plan preparation; Reporting and feedback plan
o SMART indicators (Specific, Measurable, Accurate, Realistic, Timely) and guidance as to how to choose
          them, taking into account the possibility of an information "overload".
      o Examples of the "Beneficiary Contact Monitoring" (BCM) indicators, used to monitor beneficiaries’
          perception of the operation.
     o Field visits, computerized monitoring systems, assessing training needs, baseline data,
Module 3: Monitoring and Reporting National Society Programmes
     o Complements the information from Module 2 by providing specific information on how to design an M&E
         plan for NS and co-operation agreements.
Module 4: Monitoring and Reporting
     o M&E considerations for development and emergency operation project cycles, including specific
         considerations for developing a logframe for different emergency stages and development operations.
o Defining the roles of the staff and partners during operations (including refugee, development, conflict
          situations, and slow and quick onset emergency operations).
     o Data collection and analysis in relation to emergency and development operations.
o "Suggested checklist for rapid appraisal", reporting and management use of the reports, and more on
         developing the M&E plan and budget.
Module 5: Evaluation
     o Planning, organizing and using IFRC evaluations.
     o Types of evaluations (self-evaluation, non-mandatory evaluation and mandatory evaluation). Also
         introduces terms such as mid-term, terminal and real-time evaluations.
     o IFRC evaluation standards and key principles.
     o Framework for National Society self assessment and information on self evaluations.
Module 6: Steps in Planning and Managing an Evaluation
     o Detailed information on plan including timeline and checklist for: Clarifying/agreeing on the need for the
         evaluation; Planning the evaluation; Preparing the terms of reference (TOR); Criteria for selecting the
         evaluation team; Conducting a desk review; Conducting the evaluation (including guidelines as to what to
         do before and during the mission); Preparing the evaluation report; Disseminating the evaluation; Use of
         the results and learning from the evaluation
     o Example and guidance for developing key issues for the evaluation
     o Basic information on data collection methods, including methods for document review, consultation,
         facilitation, surveys, observation.
     o Some information on participatory techniques: their importance and considerations.
     o Specific information concerning field travel
o The "aide memoire", debriefing workshop, the Recommendation Tracking System (RTS), issues affecting
         evaluation’s effectiveness.
Module 7: Baseline Studies
     o Baseline studies, their importance and common problems associated with them.
     o Baselines for different types of situations (including emergency and development).
o How to plan and manage a baseline study, answering the questions: is a baseline survey required; who
          should do the baseline survey and when; who should be studied.
      o The "comparison group", controlling the "year effect", primary and secondary data collection, site visit
          selection, work plan and budget preparation, analyzing and reporting the data (including how to use the
          results and their link with monitoring), and follow-up surveys (a brief comparison-group sketch follows
          below).
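The sketch below is a hedged illustration, in standard double-difference notation, of why a comparison group helps control the year effect; it is not drawn from the IFRC handbook itself.

    \text{Change attributable to the operation} \approx (Y^{T}_{end} - Y^{T}_{base}) - (Y^{C}_{end} - Y^{C}_{base})

where Y^{T} is the outcome in the operation (treatment) area and Y^{C} in the comparison area. For example, if an indicator improved from 12% to 8% in operation villages but also moved from 12% to 10% in comparison villages, the change attributable to the operation is (8 - 12) - (10 - 12) = -2 percentage points, not -4; the year effect common to both groups is differenced out.
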
Module 8: Tools for Data Collection
     o Data collection and analysis, and tools to use during the process.
o Definitions of related concepts such as data, accuracy, precision, bias, "optimal ignorance".


    o  Comparison between qualitative and quantitative methods.
     o Basic overview and definitions of data collection techniques, including: rapid appraisal methods (key
        informant interviews, focus groups, community interviews, direct observation, mini-surveys, visualization
        tools); participatory appraisal concepts, including triangulation; community inventory; sample survey;
        Beneficiary Contact Monitoring (BCM), including techniques to monitor BCM indicators; field visits (a
        more in-depth look at this technique); secondary data use.
Module 9: M&E Glossary

                                                            M
 (MEASURE 2005)
Gage, Anastasia J., Disha Ali, and Chiho Suzuki. (2005). A Guide for Monitoring and Evaluating
Child Health Programs. MEASURE Evaluation. Carolina Population Center, University of North
Carolina at Chapel Hill. 422p.
Web location: https://www.cpc.unc.edu/measure/publications/pdf/ms-05-15.pdf
Location: Not included (file is too big)
Keywords, sector(s): Child Health
                                                 Reviewer’s Synthesis
This guide is designed for the M&E of child health programs; its intended audience is therefore professionals involved in
this type of program. It contains some general M&E information, but it is mostly sector-focused, including a
summary list of indicators for child health programs and detailed information for each indicator. Each chapter
follows roughly the same structure, including a list of the indicators discussed and references.
                                                        Content
Chapter 1. Overview
     o Introductory information to M&E, including definitions and types of evaluations (formative, summative,
         adequacy, plausibility and probability evaluations)
     o M&E within the project phases
     o Introduction and example of a conceptual framework.
     o Indicators and their selection
     o Data sources for child health program M&E, including a brief introduction to the following: Population-
          based surveys (USAID Demographic and Health Survey, or DHS; UNICEF Multiple Indicator Cluster Survey, or
         MICS; 30-Cluster Survey; Rapid Core Assessment Tool on Child Health, or CATCH); Lot Quality
         Assurance Sampling (LQAS); Census-based Household Information Systems; Sample Vital Registration
         with Verbal Autopsy (SAVVY); Health system data (Routine Health Information System Data; Health
         Facility Survey; Supervisory-based Data Collection; Self-assessment and Peer-Assessment methods;
         program review); Qualitative data (in-depth interviews; observation; document review).
Chapter 2. Prevention of Mother-to-Child Transmission (PMTCT) of HIV
     o M&E for PMTCT programs including methodological challenges, detailed indicator information
         (definition, measurement tools, what it measures, how to measure it and strengths and limitations) and
         terminology.
Chapter 3. Newborn Health
     o Newborn programs including M&E methodological challenges, detailed indicator information (definition,
         measurement tools, what it measures, how to measure it and strengths and limitations).
Chapter 4. Immunization
     o Immunization programs including data sources, M&E methodological challenges, indicator selection and
         detailed information (definition, measurement tools, what it measures, how to measure it and strengths and
         limitations).
Chapter 5. Integrated Disease Surveillance and Response
     o Surveillance systems including M&E methodological challenges and detailed indicator information
         (definition, measurement tools, what it measures, how to measure it and strengths and limitations).
     o Includes an Annex with the following documents: WHO-recommended case definitions for reporting
         selected suspected priority diseases from the health facility to the district; WHO-recommended types of
         surveillance for vaccine-preventable diseases; Sample form for recording timeliness and completeness of
         monthly reporting from the health facility to the district; Generic case-based reporting form – from health
         facility/worker to district health team; Generic line list – for reporting from health facility to district and for
         use during outbreaks; Sample supervisory checklist for surveillance and response activities at the health
         facility.



Chapter 6. Integrated Management of Childhood Illness (IMCI): Improved Health Worker Skills
   o Information on IMCI, focused on indicators for M&E of health facility-based interventions.
   o Information on the Service Provision Assessment (SPA) tool.
   o Challenges for M&E of health worker skills and detailed indicator information (definition, measurement
        tools, what it measures, how to measure it and strengths and limitations).
   o Annex with additional Indicators for IMCI at the Health Facility Level
Chapter 7. Diarrhea, Acute Respiratory Infection, and Fever
   o Detailed information regarding the prevention, home case management, care seeking, morbidity, and
        impact indicators related to diarrhea, ARI and fever including definition, measurement tools, what it
        measures, how to measure it and strengths and limitations.
Chapter 8. Growth Monitoring and Nutrition
   o The causes of child malnutrition and interventions, based on the UNICEF framework.
    o Methodological challenges for infant and young child M&E
   o Indicator selection and detailed information (definition, measurement tools, what it measures, how to
        measure it and strengths and limitations).
Chapter 9. Mortality
   o Information on mortality and measurement challenges.
   o Detailed information on indicators (definition, measurement tools, what it measures, how to measure it and
        strengths and limitations).

(Mercy Corps 2003)
Mercy Corps. March 2003. Design, Monitoring and Evaluation Guidebook. 63p.
Web Location: http://www.mercycorps.org/files/file1137798118.pdf
Location:  Section A > Merci Corps 2003
Keywords, sectors: General M&E
                                              Reviewer’s Synopsis
This guide presents an introductory view of Mercy Corps' DM&E guidelines, with sample documents in the appendices.
It is not a source for methods.
                                                    Content
1. Fundamentals of project design
    o Design tools such as the logical framework, the indicator plan, and the work plan.
    o Eight steps to a "goal-oriented" project design.
2. Sound monitoring management
    o Monitoring and its rationale, including participatory monitoring and communicating monitoring data.
3. Criteria for a useful evaluation
    o Differentiates monitoring from evaluation
    o Presents the different evaluation options (internal, external, and participatory), also summarized in a
          table.
    o Guidance on when and what to evaluate, preparing the scope of work, and reporting.
Appendices: Sample Logical Framework; Sample Work Plan; Sample Indicators Plan; Evaluation Scope of Work
Template; Comparison of Mercy Corps and Donor Terminology; Sphere Guidelines and the Mercy Corps DM&E
Guidebook; The DM&E Checklist

                                                        S
(SAVE)
Howard-Grabman, L and G Snetro. How to Mobilize Communities for Health and Social Change. (Ch. 6-
Evaluate Together). Save the Children. 32p.
Web Location: Entire manual and other SAVE resources can be found in:
http://www.savethechildren.org/technical/resources.asp
Location:  Section A > SAVE Ch6 (Only Chapter 6: Evaluate Together)
Keyword, Sector: Participatory M&E, Program design
                                              Reviewer’s Synopsis
This guide shows the steps for engaging the community in the project, from initial assessment to evaluation. It provides
general guidance and, although it is a good tool for community mobilization, it does not provide many resources for
evaluation methodology and tools.




                                                       Content
Chapter 1: Prepare to Mobilize
   o Guidance on selecting the community and the health issue to work on, identifying resources and
        constraints, developing a mobilization plan and the team.
Chapter 2: Organize the Community for Action
   o Guidance for engaging the community into participation, including the factors that affect participation.
Chapter 3: Explore the Health Issue and Set Priorities
   o How to engage the community in exploring health issues and getting valuable information from them.
Chapter 4: Plan Together
   o Rationale for involving the community in planning and how to consult the community in the planning
        phase.
Chapter 5: Act Together
    o Engaging the community in the project implementation, including how to monitor the community's progress.
Chapter 6: Evaluate Together
   o General information on forming the evaluation team, identifying the stakeholders and drafting the
        evaluation plan.
Chapter 7: Prepare to Scale Up
   o Information about expanding the program, including the scaling-up assumptions, assessing the community
        potential, and case studies.

 (SAVE 2006)
Gosling, L. and M. Edwards. 2006. Toolkits: A Practical Guide to planning, monitoring, evaluation and impact
assessment. Save the Children. 341p.
Web Location: Not Available
Location: Hardcopy only
Keywords, Sectors: General M&E, Development, emergency, advocacy programs; external consultants.
                                                 Reviewer’s Synopsis
As the title suggests, this book is a practical and easy-to-use guide for program planning and M&E. It provides
good explanations with examples from Save the Children programs. The main highlight of this guide is that it
encompasses three program types: development, emergency and advocacy, something not many guides offer. Section
three provides tools for program planning and M&E, but more resources might be needed for using some of
the tools.
                                                       Content
Section one: Underlying principles
Chapter 1: The use of Assessment, Monitoring, Review and Evaluation in Programme Design and Management
     o The program cycle components and the program "spiral"
     o Rights-based development and how it contrasts with the "needs-based" approach.
Chapter 2: Involve the Relevant People
     o Guidance on selecting who will participate in the evaluation and when to include them.
     o Information on possible stakeholders and their benefits from the evaluation and partnerships.
     o Advantages and disadvantages of including outsiders in the M&E process.
Chapter 3: Recognize and Deal with Differences and Discrimination
     o Guidance on identifying the different groups in the community and the relationships among them, to avoid
          discrimination.
Chapter 4: The Systematic Collection and Analysis of Information
     o Bias and how to minimize it
     o Short summary of types of sampling
     o Qualitative, quantitative and participatory data collection methods.
Section two: Practical questions
Chapter 5: Questions to Consider When Planning Assessment, Monitoring, Review, Evaluation, or impact
assessment
     o Introduction to the Terms of Reference
     o Guidance questions for developing the following: the aim of the exercise; who it is for and how they will use
          the results; objectives and key questions of the exercise; information collection and analysis; ways of
          presenting the results; organization





    o    Table with tasks and responsibilities and information on the uses of computers (including GIS, databases,
         management information systems, among others)
Chapter 6: Assessment and Programme Planning
    o Guidance through the initial assessment process
    o Indicators and baseline studies
Chapter 7: Monitoring
    o Different types of monitoring and how to design a monitoring system
    o Introduction to data collection tools for monitoring: surveys, participatory methods, measuring skills and
         knowledge, record keeping (forms and diaries), supervision checklists and reports, project visits, case
         studies, spot checks
Chapter 8: Review and Evaluation
    o Contrast between review and evaluation
    o Questions to consider when carrying out a review or an evaluation: the purpose and aim, who it is for and
         how they will use the results, objectives and key questions, information collection and analysis, ways of
         presenting the results, organization.
    o Information on external reviewers and evaluators
Chapter 9: Impact Assessment
    o Theoretical and practical information on impact assessment
    o Defining a model of change and developing key questions to assess it.
    o Sample indicators for the following areas: material wealth, social well-being or human capital,
         empowerment or political capital.
    o Information on assessing unexpected and negative impacts.
    o Areas to take into consideration when assessing impacts: sampling, timing, lack of baseline, attribution,
         cross-checking results.
Chapter 10: Planning, Monitoring, Review and Evaluation in Emergency Situations
    o Overview of the SPHERE Project
    o Constraints and compromises in emergency situations and how to deal with them.
    o Guidance on defining the objectives for emergency assessments, disaster preparedness plans, relief
         programs, monitoring, reviews and evaluations.
    o Key questions and guidance on selecting indicators for emergencies related to mortality, morbidity, water
         supply and sanitation, nutrition, food aid, shelter and site planning, health services, psychosocial well-
         being, child protection and education.
    o Data collection methods and analysis in emergencies, including documents review, rapid appraisals,
         surveys, information systems (including strengths, weaknesses and prerequisites).
    o Ways of communicating the information obtained.
Chapter 11: Planning, monitoring and evaluating advocacy
    o Guidance for advocacy program planning and M&E, including challenges.
Section three: Tools
Tool 1: Participatory Learning and Action (PLA)
    o Description; features; methods used (secondary sources, direct observation, interviews, focus groups, oral
         history, ranking and scoring, diagrams and maps, techniques for children); analyzing the results; variants
         (rapid rural appraisal, rapid assessment procedures); strengths and weaknesses; prerequisites.
Tool 2: Surveys
    o Description, design, data collection and analysis, strengths, weaknesses, prerequisites and when they should
         be used.
Tool 3: Logical Framework Analysis (LFA)
    o Description, how to carry it out, uses for M&E, strengths, weaknesses, prerequisites, when it should be
         used, variation (the logic model)
Tool 4: Cost-effectiveness (and cost-benefit) Analysis
    o Description, methods for calculating (cost-effectiveness ratio, non-quantifiable benefit rating), strengths,
         weaknesses, and when to use.
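    For orientation only, and not drawn from the Save the Children guide: the cost-effectiveness ratio named above is
    simply total programme cost divided by the units of outcome achieved. A worked example with purely hypothetical
    figures, in LaTeX notation:

        \mathrm{CER} = \frac{\text{total programme cost}}{\text{units of outcome achieved}}
                     = \frac{\$60{,}000}{3{,}000\ \text{children reached}}
                     = \$20\ \text{per child reached}

    A cost-benefit analysis differs in that both costs and benefits are expressed in monetary terms, so the result is a
    net value or benefit-cost ratio rather than a cost per unit of outcome.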
Tool 5: Strengths, Weaknesses, Opportunities and Constraints (SWOC) Analysis
    o Brief description
Tool 6: Setting Objectives
    o The problem tree: description, how-to, and when to use
    o The objectives tree: description, how-to, and using the results.



Tool 7: Example of Evaluating Participation
    o Description, indicators, strengths, weaknesses, and prerequisites
Tool 8: Using Consultants
    o Description, the role of the consultant, key questions to ask and guidance on hiring, drawing up a
         consultant brief, managing the consultant, sample agreement.
Tool 9: Programme Visits
    o (As a monitoring tool) Description, planning a visit, strengths and weaknesses of visits, prerequisites.
Tool 10: Presentation and sharing information: meetings, diagrams, video and theater
    o Meetings: description, strengths, weaknesses, prerequisites.
    o Diagrams: description, types (social and resource mapping, time lines, impact diagrams, spider diagrams),
         strengths, weaknesses, prerequisites.
    o Video: description, uses, weaknesses, strengths, prerequisites.
    o Theater: description, how-to, uses, strengths, weaknesses, and prerequisites
Tool 11: Training and development in planning, monitoring, review, evaluation and impact assessment
    o Description, related issues, developing a strategy to improve planning, monitoring and evaluation (PME),
         strengths and weaknesses of different sources of training; prerequisites.
Tool 12: Stakeholder analysis
    o Description, how-to according to purpose (to decide who should be involved in the project cycle and for
         planning advocacy)
Tool 13: Frameworks to help analyze the advocacy process
    o Frameworks for M&E policy change and implementation, M&E of the capacity for advocacy, M&E of
         networks and movements.
Tool 14: Frameworks for developing M&E questions in emergency situations
    o Related to chapter 10.
Further Reading
Glossary

(SAVE 2004)
Duncan, J. 2004. Children in crisis: Good practices in evaluating psychosocial programming. The
International Psychosocial Evaluation Committee and Save the Children Federation, Inc. 131p.
Web Location:
http://www.savethechildren.org/publications/Good_Practices_in_Evaluating_Psychosocial_Programming.pdf
Location:  Section A > SAVE 2004
Keywords, sector: General M&E, Psychosocial, humanitarian programs
                                              Reviewer’s Synopsis:
This manual is a DM&E guide to psychosocial programming. It introduces psychosocial programming and
addresses evaluation using case examples and basic M&E concepts.
                                                     Content:
Chapter 1: Complex Emergencies and Psychosocial Development
     o Introduces the impact of complex emergencies on children and communities.
Chapter 2: What Is Psychosocial Programming?
      o Introduces all aspects of psychosocial programs, such as targeting, different approaches and how to fit
         psychosocial programming into other programs.
Chapter 3: Evaluating Psychosocial Projects: Overarching Principles and Project Logic
Models
     o Short introduction to evaluation in complex emergencies and in psychosocial programs.
     o Introduction to the logic model and key definitions (input, output, outcomes, impact)
Chapter 4: Fundamental Goals of Psychosocial Programming and Defining Objectives
and Indicators
     o Overview of psychosocial programming goals and objectives.
     o Introduces indicators (including proxy indicators) and how to select them for psychosocial programs. Also
         provides information on using indicators to monitor the program.
     o Short sections defining qualitative and quantitative data.
Chapter 5: Identifying Data Sources and Methods of Data Collection





     o   Introduces the following methods (definitions) and provides examples of data sources through psychosocial
         program samples: Interviews (open-ended or focus groups; key informant; unstructured or semi-structured,
         structured, self-report), ethnographic techniques (participatory and systematic observations; participatory
         appraisal), direct observation techniques (with common ways of recording: narratives; event records;
         interval recording), back translation of scales, triangulation.
Chapter 6: Project Impact Evaluation
    o Impact evaluations in emergencies and research design options, including the different types of
          "comparison groups".
    o Research design options for the following projects: Partial coverage projects (randomized experiments,
         quasi-experimental design), full coverage projects (nonequivalent control group design, repeated measures
          design, simple before and after design, cross-sectional ("non-uniform") design).
    o Explains the issue of sampling, including probability and non-probability sampling.
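    To make the probability versus non-probability sampling distinction concrete, the sketch below contrasts a simple
    random sample with a convenience sample. It is not taken from the Save the Children manual; the household register
    and sample size are hypothetical, and only standard-library Python is used.

        import random

        # Hypothetical register of 200 beneficiary households (IDs only).
        households = [f"HH-{i:03d}" for i in range(1, 201)]

        # Probability sampling: every household has a known, equal chance of selection,
        # so findings can be generalized to the register with a quantifiable sampling error.
        random.seed(1)  # fixed seed only so the sketch is reproducible
        random_sample = random.sample(households, k=20)

        # Non-probability (convenience) sampling: e.g. the first 20 households reached on a
        # field visit; selection probabilities are unknown, so generalization is much weaker.
        convenience_sample = households[:20]

        print(random_sample[:5])
        print(convenience_sample[:5])

    The same distinction matters for the partial- and full-coverage designs listed above: without known selection
    probabilities, an evaluator cannot attach a margin of error to estimates drawn from the sample.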
Chapter 7: Designing a Psychosocial Project and Building an Evaluation Strategy
    o Guidance questions for designing a psychosocial project and developing outcome and impact evaluations.

 (SFCG 2006)
Church, C. and MM Rogers. 2006. Designing for Results: Integrating Monitoring and Evaluation in Conflict
Transformation Programs. Search for Common Ground. 244p.
Web Location: Part 1 (Chapters 1-8): http://www.sfcg.org/documents/manualpart1.pdf; Part 2 (Chapters 9-12):
http://www.sfcg.org/documents/manualpart2.pdf
Location:  Section A >  SFCG 2006 > (part 1 and part 2)
Keywords, sector: Peacekeeping, General M&E, external evaluators
                                                 Reviewer’s Synopsis
This manual is focused on the particular needs of the conflict transformation field. It addresses the many challenges
faced by conflict transformation practitioners in their attempts to measure and increase the effectiveness of their
work, offering practical tips and examples from around the world.
                                                        Content
Chapter 1: Learning
     o The adult learning process and the importance of learning in peacebuilding operations.
Chapter 2: Understanding Change
     o Different theories of change related to peacebuilding, and how to use them.
Chapter 3: Program Design
     o Guidance on setting a goal, defining objectives, and selecting activities and outputs.
     o Comparison between logical and results frameworks.
     o A "Donor Terminology Decoder" table is included.
Chapter 4: Indicators
     o Information on indicator components and quality.
     o Compares qualitative and quantitative indicators
     o Guidance on developing indicators (mostly related to peacebuilding)
Chapter 5: Baseline
     o Contrasts conflict assessment and baseline studies
     o Uses of baselines
     o Information on the baseline plan (with an example)
     o Short section on what to do if there is no baseline
Chapter 6: Monitoring
     o Contrasts monitoring and evaluation
     o Information on context and implementation monitoring, and on using monitoring information.
Chapter 7: Evaluation Introduction
     o Overview of the evaluation stages
Chapter 8: Evaluation Preparation: Stage One
     o Information on who should be included in this stage
     o Includes an "evaluation preparation decision flow chart"
     o Guidance on deciding the evaluation objectives, who will read the evaluation, the type of evaluation
          (formative, summative or impact), the role of the evaluator, the evaluation approach (with a decision
         flowchart), the scope of the evaluation, the qualifications of the evaluator (including deciding between
         internal or external evaluators), when to do the evaluation, and the evaluation cost.



Chapter 9: Evaluation Management: Stage Two
   o Guidance on developing a terms of reference (TOR), the evaluation plan, and working with an external
        evaluator.
   o Presents common problems related to this stage.
Chapter 10: Evaluation Utilization: Stage 3
   o Information on how to use the results of the evaluation.
Chapter 11: Ethics in Evaluation
    o Presents the common ethical challenges in design, baselines, and evaluations.
   o Short paragraph concerning external evaluators
Chapter 12: Methods
   o Key terms concerning methodology
   o Comparison between qualitative and quantitative data
   o Methods explained (one-paragraph explanation and a table with strengths, weaknesses and cost): direct
        observation; interviews; focus groups; participant diaries; photography/video; project document review;
        questionnaire; secondary data review; survey; testing; participatory learning and action techniques, or PLA
         (Venn Diagrams, Pairwise ranking, conflict mapping, drawing, role playing).
   o Guidance on selecting the methods
   o Mention of triangulation
    o Table with "data availability, method difficulty and reliability" for the methods described above.
   o Short paragraph on analysis.
   o Tools unique to peacebuilding programs (one paragraph explanation): activity interview, cognitive social
        capital assessment tool (CSA), media content analysis tool, case study, capacity enhancement needs
        assessment (CENA), Four levels of training evaluation.
Appendices: Sources for the Terminology Decoder; Evaluation Terms of Reference (TOR) Circulation Options

                                                          U
(UNDP 2002)
United Nations Development Programme. 2002. Handbook on Monitoring and Evaluating for Results.
Web Location: http://stone.undp.org/undpweb/eo/evalnet/docstore3/yellowbook/documents/full_draft.pdf
Location:  Section A > UNDP 2002
Keywords, sectors: General M&E; joint evaluations
                                                 Reviewer’s Synopsis:
The intended use of this manual is to improve UNDP programs and policies by strengthening results-oriented
M&E, with a focus on development country programmes and on outcome M&E. The "how to" chapters are specific to
UNDP programmes. It is not a good source of methodology information or of tools applicable outside of
UNDP.
                                                       Content:
Part I: The Monitoring and Evaluation Framework
Chapter 1: Purposes and Definitions
     o Defines "results-oriented monitoring and evaluation" and its objectives
     o Definitions of other terms such as reporting, feedback, and lessons learned.
Chapter 2: The Shift towards Results-Based Management
     o Defines results-based management (RBM) and highlights the main features of a results-based M&E
          system, in the context of RBM.
     o Explains what outcome monitoring is and how it contrasts with implementation monitoring.
     o Outcome evaluation and its relationship with outcome monitoring.
     o The importance of partnerships in programming and their roles in M&E; "soft" assistance (its significance
          and M&E); how the country office is affected by the shift to RBM; and RBM challenges for program managers.
Part II: How to Conduct Monitoring and Evaluation
Chapter 3: Planning for Monitoring and Evaluation
     o Guidance on how to develop a logical planning framework for M&E for country programmes and how to
          develop a work plan.
     o Planning a monitoring system, including information about arrangements during the project and
          information for the country office preparation.
     o Planning for an outcome evaluation.
Chapter 4: The Monitoring Process ("how to…")



     o    A "good monitoring system", the scope of monitoring, who is responsible for monitoring (in the UNDP
         organization framework).
     o Guidance and information on selecting the right monitoring tools. The tools discussed in the chapter
         include: field visits, annual project report, outcome groups, and annual review.
Chapter 5: The Evaluation Process ("how to…")
     o Introduction to the methodology of an outcome evaluation and suggestions to improve evaluations.
     o Information on how to manage an evaluation, including data collection, backstopping and feedback,
          reporting and following up. The data collection section does not provide in-depth information.
     o Information regarding joint evaluations
     o Involving partners and stakeholders, defining the scope of the evaluation, drafting the TOR (terms of
         reference), budgeting and factors affecting the costs, organizing the relevant documentation, evaluation
         focal team formation and selection.
PART III: Monitoring and Evaluating Performance
Chapter 6: Performance Measurement
     o UNDP’s common rating system to rate performance at the results level.
     o Guidance on indicator selection, including information on what to do when there is no baseline, using
         proxy indicators, disaggregated data.
     o Information on using indicators for results-oriented monitoring (output, outcome, impact).
PART IV: Use of Monitoring and Evaluation Information
Chapter 7: Knowledge and Learning: Use of Evaluative Evidence
     o Guidance to apply M&E information for improved performance, decision-making and learning.
     o The feedback process for country officers and applying the recommendations from the feedback.
Resources and Annexes: Glossary; related documents and websites; monitoring and evaluation tools (evaluation and
tracking plan; evaluation terms of reference; annual project report; field visit report; menu of monitoring tools)

 (UNFPA 2004)
The United Nations Population Fund. 2004. The Programme Manager’s Planning, Monitoring and
Evaluation Toolkit.
Web Location: http://www.unfpa.org/monitoring/toolkit.htm (Sections downloaded separately, either PDF or MS
Word format, in English, Spanish, Arabic and French)
Location:  Section A >  UNFPA 2004 > (individual sections)
Keyword, sector: General, Participatory M&E, maternal health
                                                 Reviewer’s Synopsis
The toolkit is a supplement to the UNFPA programming guidelines. It provides guidance and options for UNFPA
Country Office staff to improve planning, monitoring and evaluation (PM&E) activities in the context of results-
based programme management. It is also useful for programme managers at headquarters and for national
programme managers and counterparts. The guide is very general, and provides only brief information on some
methods.
                                                       Content
1. Glossary of planning, monitoring and evaluation terms
2. Defining Evaluation
     o Evaluation and its relationship with monitoring.
3. Purposes of Evaluation
     o The purposes of evaluating and the use of the information.
4. Stakeholder participation in monitoring and evaluation
     o Participatory M&E, and stakeholders and their corresponding roles.
     o Describes situations relevant to participatory M&E.
     o Compares the responsibilities in participatory and non-participatory M&E.
     o Significance and different modalities of stakeholder participation in M&E
5. Planning and managing an evaluation
   Part I: Planning Evaluations
     o Overview of the "evaluation analytical process", including preparation, implementation, analysis of
          information, conclusions, lessons learned and recommendations.
   Part II: Defining evaluation questions and measurement standards





    o     Overview of factors to be taken into consideration when formulating evaluation questions regarding the
          validity of the design, the delivery process, program performance, relevance of the program, effectiveness,
          efficiency, sustainability, causality, unanticipated results, and possibility of alternative strategies.
     o Methodological challenges when measuring program relevance, effectiveness, efficiency and sustainability.
   Part III: The data collection process
     o Information regarding the determination of information needs and of data collection methods.
     o Introduction to quantitative and qualitative methods.
     o Brief overview of qualitative and quantitative analysis and of some data collection methods (rapid
          appraisal, observation, surveys, key informant interviews, focus groups, community interviews, and
          nominal group technique)
   Part IV: Managing the evaluation process
     o     Discusses various aspects of managing the evaluation process: "who" does "what"; steps in the
          development of a terms of reference and in the selection of an evaluator/evaluation team; and pointers on
          managing and supervising the conduct of an evaluation.
     o Provides overall guidance for a traditional approach to evaluation with limited stakeholder participation.
          However, the principles and management responsibilities mentioned in the tool should, with some
          adaptation, be applied to all types of evaluations.
   Part V. Communicating and using evaluations results
     o Suggests steps and considerations for the effective communication and use of evaluation results.
   Part VI. Evaluation Standards
     o Explains different standards to measure evaluations by.
6. Programme Indicators
   Part I: Identifying Output Indicators - The Basic Concepts
     o Explains the process of selecting indicators, including identifying the means of verification.
      o Information on establishing targets and expressing the indicators.
   Part II: Indicators for Reducing Maternal Mortality
      o Overview of maternal health programs, the monitoring process, and information on indicators.

 (UNICEF 1991)
UNICEF. 1991. A UNICEF Guide for Monitoring and Evaluation.
Web Location (the address given in the document is not working): http://www.preval.org/documentos/00473.pdf
Location:  Section A > UNICEF 1991
Keywords, sectors: General M&E
                                                 Reviewer’s Synopsis:
This manual explains monitoring and evaluation processes and emphasizes practical suggestions. The examples
used are from health services in UNICEF programming. The manual is focused on UNICEF programs. It gives an
overview of M&E but, compared with the other manuals, it is not a complete tool. The manual is not listed among
the tools on the UNICEF web page, and no more recent version is presented there either.
                                                        Content:
Section I: Introduction
     o Overview of M&E, definitions and relationship between monitoring and evaluation.
Section II: Organization of Monitoring and Evaluation
     o Information concerning UNICEF roles and responsibilities in M&E.
Section III: Strengthening Monitoring
     o Very general information concerning indicators, monitoring, participation, and monitoring the situation of
          women and children.
Section IV: Strengthening Evaluation
     o Overview of the evaluation process: why evaluate, when to evaluate, what is the scope, who is responsible
          for each task, how the data are collected, resources, cost analysis, developing recommendations, and using
          the evaluation results.
     o Overview of data sources for evaluations
Appendices: (mostly relevant for UNICEF): A. glossary; B. acronyms and abbreviations; C. indicators for some
sectors of UNICEF activity; D. checklists for evaluation manager and team; D*. Field trip report; E. Field trip
report; G. annual report form for listing evaluations and studies.
(* The manual contains two appendices labeled D)






                                                         W
 (WB-IPDET 2001)
World Bank. 2001. International Program for Development Evaluation Training. The Independent
Evaluation Group.
Web Location: http://www.worldbank.org/oed/ipdet/modules.html
Folder:  Section A >  WB-IPDET 2001> (Individual modules)
Keywords, sectors: General M&E
                                               Reviewer’s Synopsis:
This resource contains twelve modules from the two-week core course, which aims to build skills and knowledge
required for high-quality development evaluation. The course is designed especially for those with little prior
evaluation experience or for those wanting a refresher course.
                                                     Contents:
The following is the list of modules included in the course. As the modules state, each module is intended to stand
alone and includes an instructional introduction, at least one case example, application exercises, references to
further reading and resources, and PowerPoint presentations from IPDET.
1. Introduction to Development Evaluation
2. Evaluation Models
3. New Development Evaluation Approaches
4. Evaluation Questions
5. Impact, Descriptive, and Normative Evaluation Designs
6. Data Collection Methods (Surveys, focus groups, interviews, diaries, checklists)
7. Sampling
8. Data Analysis and Interpretation
9. Presenting Results
10. Putting it all Together
11. Building a Performance-Based Monitoring and Evaluation System
12. Development Evaluation Issues

(WB 2004a)
The World Bank. 2004. Monitoring & Evaluation: Some tools, methods & approaches. 26p.
Web Location: http://www.worldbank.org/oed/ecd/me_tools_and_approaches.html (Downloadable in different
languages)
Location:  Section A > WB 2004a
Keywords, sectors: General M&E
                                                Reviewer’s Synopsis
A short booklet with an overview of M&E methods. It gives very short explanations of the tools listed below,
including definition, use, advantages, disadvantages, and skill and time requirements.
                                                       Content:
M&E Overview
Performance Indicators
The Logical Framework Approach
Theory-Based Evaluation
Formal Surveys
Rapid Appraisal Methods
Participatory Methods
Public Expenditure Tracking Surveys
Cost-Benefit and Cost-Effectiveness Analysis
Impact Evaluation
Additional Resources on Monitoring and Evaluation

(WB 2004b)
Kusek, JZ and RC Rist. 2004. Ten steps to a results-based monitoring and evaluation system: A handbook
for development practitioners. The World Bank. 268p.
Web Location: http://www.oecd.org/dataoecd/23/27/35281194.pdf
Location:  Section A > WB 2004b
Keywords, sector: General M&E



                                              Reviewer’s Synopsis:
This manual is targeted at officials charged with managing for results. According to the preface, it can
stand alone as a guide on how to design and construct a results-based M&E system in the public sector. However, it
provides no concrete guidance concerning evaluation methods. Each section contains examples from the field for
each of the 10 steps.

                                                       Content:
Introduction: Building a Results-Based Monitoring and Evaluation System
     o The international initiatives driving the need for reform, including the Millennium Development Goals,
          among others.
     o Defines the concept of results-based M&E, its key features, and how results-based monitoring differs
          from implementation monitoring.
Chapter 1: Step 1: Conducting a Readiness Assessment
     o Rationale for conducting this type of assessment (to check whether the elements necessary for a
          results-based M&E system are in place) and the key areas that should be considered during the assessment.
Chapter 2: Step 2: Agreeing on Outcomes to Monitor and Evaluate
     o Selecting outcomes, and using a participatory approach in the process.
Chapter 3: Step 3: Selecting Key Performance Indicators to Monitor Outcomes
     o Information on selecting outcome indicators, using proxy indicators and pre-designed indicators.
Chapter 4: Step 4: Setting Baselines and Gathering Data on Indicators
     o Identifying baseline information and data sources to measure indicators.
     o Tables comparing data collection methods (review of program records, self-administered questionnaires,
          interviews, and rating by trained observer) in terms of cost, training and time requirements, and response rate.
Chapter 5: Step 5: Planning for Improvement—Selecting Results Targets
     o Information on how to set the desired performance target for the project.
Chapter 6: Step 6: Monitoring for Results
     o Illustrates the Gantt chart as a management tool.
     o More in-depth and how-to information concerning results-based monitoring and its links with implementation
          monitoring.
Chapter 7: Step 7: The "E" in M&E—Using Evaluation Information to Support a Results-Based Management
System
     o Outlines the different uses for evaluations and characteristics of a quality evaluation.
     o Describes the different types of evaluations: performance logic chain assessment, pre-implementation
          assessment, process implementation evaluation, rapid appraisal, case study, impact evaluation, and meta-
          evaluation.
Chapter 8: Step 8: Reporting the Findings
     o The uses of M&E findings, how to report them, and what to do if the information presents "bad
          performance news".
Chapter 9: Step 9: Using the Findings
     o Guidance on using the M&E findings and sharing the information.
Chapter 10: Step 10: Sustaining the M&E System within the Organization
     o The six components for sustaining Results-Based M&E systems and how to evaluate the system.
Chapter 11: Making Results-Based M&E Work for You and Your Organization
     o Rationale behind a results-based M&E system and how to use the steps outlined above.
Annexes: I: Assessing Performance-Based Monitoring and Evaluation Capacity: An Assessment Survey for
Countries, Development Institutions, and Their Partners; II: Readiness Assessment: Toward Results-Based
Monitoring and Evaluation in Egypt; III: Millennium Development Goals (MDGs): List of Goals and Targets; IV:
National Evaluation Policy for Sri Lanka: Sri Lanka Evaluation Association (SLEva) jointly with the Ministry of
Policy Development and Implementation; V: Andhra Pradesh (India) Performance Accountability Act 2003: (Draft
Act) (APPAC Act of 2003); VI: Glossary: OECD Glossary of Key Terms in Evaluation and Results-Based
Management (2002).

(WKKF 1998)
W.K. Kellogg Foundation. 1998. WK Kellogg Foundation Evaluation Handbook.
Web Location: http://www.wkkf.org/pubs/Tools/Evaluation/Pub770.pdf
Location:  Section A > WKKF 1998



Keywords, Sector: General M&E
                                                Reviewer’s Synopsis
Quoting the document: For project staff with evaluation experience, or for those inexperienced in evaluation but
with the time and resources to learn more, this handbook provides enough basic information to allow project staff to
conduct an evaluation without the assistance of an external evaluator. For those with little or no evaluation
experience, and without the time or resources to learn more, this handbook can help project staff to plan and
conduct an evaluation with the assistance of an external evaluator.
This handbook is not intended to serve as an exhaustive instructional guide for conducting evaluation. It provides a
framework for thinking about evaluation and outlines a blueprint for designing and conducting evaluations, either
independently or with the support of an external evaluator/consultant. For more detailed guidance on the technical
aspects of evaluation, you may wish to consult the sources recommended in the Bibliography section at the end of
the handbook.
                                                      Content
Part One: W.K. Kellogg Foundation’s Philosophy and Expectations
Chapter 1: Where We Are: Understanding the W.K. Kellogg Foundation’s Framework for Evaluation
     o Introduction to the foundation
Chapter 2: How We Got Here: A Summary of the Evaluation Landscape, History, Paradigms, and Balancing Acts
     o Theoretical background to evaluations and discussion of evaluation paradigms.
Chapter 3: Three Levels of Evaluation
     o Discussion of the three levels of evaluation developed by the foundation: Project-level, Cluster, and
          Programming and Policymaking Evaluations.
Part Two: Blueprint for Conducting Project-Level Evaluation
Chapter 4: Exploring the Three Components of Project-Level Evaluation: Context, implementation, and outcome
evaluation.
     o Definition, potential uses and examples for context, implementation and outcome evaluations.
     o Information on developing and implementing an outcome evaluation, with guidance questions to facilitate
          the process.
     o Discussion of the program logic model, including types (outcome model, activities model, theory model)
          and examples.
Chapter 5: Planning and Implementing Project-Level Evaluation
     o Guidance on the planning for an evaluation through the following steps: Identifying Stakeholders and
          Establishing an Evaluation Team; Developing Evaluation Questions; Budgeting for an Evaluation;
          Selecting an Evaluator.
     o Guidance on designing and conducting an evaluation through the following steps: Determining Data-
          Collection Methods; Collecting Data; Analyzing and Interpreting Data; Communicating Findings and
          Utilizing Results; Communicating Findings and Insights; Utilizing the Process and Results of Evaluation.
     o Methods discussed: observation; interviewing; group interviews (nominal group/Delphi techniques);
          written questionnaires; tests and assessments (physiological health status measure, knowledge or
          achievement test); document review.







                                                  Section B:
                                     Non-Manual Resources

                                                 Introduction
The following section contains resources on general M&E, methods, and indicators. The resources contained in this
section come from a variety of sources, such as FANTA, the USAID Tips, and ALNAP, among others.

As in Section A, the resources are organized in alphabetical order. This section provides just a short review of each
resource, plus the web and folder locations, where available.

The matrix below provides a quick reference to the resources' content, organized according to the main topics
encountered: general M&E, indicators, and methods.

                              Non-Manual Resources Review Matrix

Citation                                 Keywords / sectors
General M&E
Adrien 2003                               Training
AEA 2002                                  Evaluation (constraints)
Aubel 1999                                Participatory M&E
Bakyaita and Root 2005                    General (Malaria)
Bamberger et al 2003                      Evaluation (constraints)
Bergeron et al 2006a                      General
Bergeron et al 2006b                      General
Bonnard 2002                              Evaluation (Scope of Work)
Chapman and Wameyo 2001                   Evaluation (advocacy)
Cramb and Purcell 2001                    Participatory M&E (agriculture)
DAC 1999                                  Evaluation (humanitarian)
DAC 2000                                  Evaluation (Multi-donor)
Edbert and Hoechstetter                   Evaluation (Advocacy)
GNT 2002                                  Evaluation (Consultants)
HIV Alliance                              General (HIV/AIDS)
IDS                                       Participatory M&E
IFRC 2005                                 Evaluation (Framework)
Levinson et al 1999                       General M&E (Nutrition)
USAID Tips 1996a                          Participatory M&E
USAID Tips 1996c                          Evaluation (Scope of Work)
USAID Tips 1996g                          Monitoring (plan)
USAID Tips 1996h                          General M&E (Targets)
USAID Tips 1996i                          General M&E (Customer service Assessment)
Van de Putte 2001                         Evaluation (Humanitarian programs)
Indicators
Billing et al 1999                        Indicators (Water and Sanitation)
Booth and Lucas 2002                      Indicators (poverty)
Caplan and Jones 2002                     Indicators (Partnerships)
Mayoux d                                  Indicators




Riely et al 1999                        Indicators (food security)
Swindale and Bilinsky 2005              Indicators (Food Access)
Swindale and Ochi-Vachaspati 2005       Indicators (Food Consumption)
USAID Tips 1996f                        Indicators
Vermillion 2000                         General M&E (irrigation)
WB                                      Evaluation (designs)
WB 2002                                 Participatory M&E
WV 2003                                 Indicators (health related, poverty, participation)
Methods
Boyce and Neale 2006a                  Interviews
Boyce and Neale 2006b                  Mystery Clients
Cramb and Purcell 2001                 Mapping, diagrams, calendars, ranking (preference, pairwise, wealth and
                                       wellbeing), interviews (semi-structured and structured).
George                                Quantitative methods, in general
Levinson et al 1999                    Qualitative and quantitative methods, observation, key informant
                                       interviews, focus groups, surveys, direct measurements, secondary data,
                                       data analysis
Magnani 1997                          Sampling
Mayoux a                              Qualitative methods, in general
Mayoux b                              Participatory
Mayoux c                              Sampling
Neale et al 2006                      Case Study
Pisani 2003                           Population size estimation (HIV)
Ravallion 2002                        Statistics
USAID Tips 1996b                      Interviews
USAID Tips 1996d                      Direct Observation
USAID Tips 1996e                      Rapid Appraisal
USAID Tips 1996j                      Focus groups
Zeller 2004                           Surveys
WV 2003                               Focus Groups, document reviews, survey guidelines, analysis



                                       Review Section 

                                                      A
(Adrien 2003)
Adrien, M.H. February 2003. Guide to conducting reviews of organizations supplying M&E training. The
World Bank Operations Evaluation Department. ECD Working Paper Series, No. 9. 19p.
Web Location:
http://lnweb18.worldbank.org/oed/oeddoclib.nsf/0/3c997bfad76d6bb385256a780069cd4e/$FILE/ME_guide.pdf
Location:  Section B >  Training > ECD Training OrgRev
Keyword: Training
Review: Guide to review M&E training in developing countries.

(AEA 2002)
AEA Professional Development Session. November 5, 2002. Impact Evaluations When Time and Money are
Limited: Lessons from International Development on the Design of Rapid and Economical, but
Methodologically Sound, Impact Evaluations. 42p.
Web Location: http://www.enterprise-impact.org.uk/pdf/IEWhenTimeandMoney.pdf
Location:  Section B >  General M&E > AEA 2002
Keywords: Evaluation
Review: Handouts discussing the following topics: The Growing Demand for Rapid and Economical Impact
Evaluations; Approaches to the design of low-cost impact evaluations; Rapid and Economical Methods for Impact
Evaluations; Introduction to the theory and practice of impact evaluation design; Threats to the validity of
interpretations about program impacts; Shoestring Project Evaluation worksheet; Case Studies for Group Exercises:
Three Approaches to Evaluating the Gender Impacts of Micro-Credit Programs in Bangladesh.

(Aubel 1999)
Aubel, J. 1999. Participatory Program Evaluation Manual. Catholic Relief Services. 86p.
Web Location: Not Available
Location:  Section B >  General M&E > CRS PM&E
Keyword: Participatory M&E
Review: In-depth guidance on participatory program evaluation.

                                                     B
(Bakyaita and Root 2005)
Bakyaita, N and G. Root. 2005. Building Capacity in Monitoring and Evaluating Roll Back Malaria in
Africa: A conceptual framework for the roll back malaria partnership. The Roll Back Malaria Monitoring
and Evaluation Reference Group.
Web Location: Not available
Location:  Section B >  General M&E > RollBackMalaria M&E
Review: This document provides background on the epidemiology of malaria in Africa and M&E guidance for
related programs.

 (Bamberger et al 2003)
Bamberger, M., J. Rugh, M. Church, and L. Fort. 2003. Shoestring Evaluation: Designing Impact
Evaluations under Budget, Time and Data Constraints. American Journal of Evaluation 25(1): 5-37. 33p.
Web Location: http://www.enterprise-impact.org.uk/pdf/ShoestringEvaluationAJE.pdf
Location:  Section B >  General M&E > AJE ShoestringEval
Keywords: Evaluation
Review: The paper discusses two common scenarios where evaluators must conduct impact evaluations when
working under budget, time or data constraints. In one scenario, the evaluator is not called in until the project is
already well advanced, and there is a tight deadline for completing the evaluation, frequently combined with a
limited budget and without access to baseline data. In the other scenario the evaluator is called in early, but it is not
possible to collect baseline data on a control group and sometimes not even on the project population.

(Bergeron et al 2006a)
Bergeron, G., M. Deitchler, P. Bilinsky, and A. Swindale. February 2006. Monitoring and Evaluation
framework for title II development-oriented projects, USAID Technical Note 10. 4p.
Web Location: http://www.fantaproject.org/downloads/pdfs/TN10_MEFramework.pdf
Location:  Section B >  General M&E > USAID Title2
Keyword: General M&E
Review: A brief four-page article on M&E and its usefulness to program managers.

(Bergeron et al 2006b)
Bergeron, G., M. Deitchler, P. Bilinsky, and A. Swindale. March 2006. Evaluating title II development-
oriented multi-year assistance projects, USAID Technical Note 11. 8p.
 Web Location: http://www.fantaproject.org/downloads/pdfs/TN11_MYAP.pdf
Location:  Section B >  General M&E > USAID MultiYr Projects
Keyword: Evaluation
Review: Explanation of the four types of evaluation designs.

(Billing et al 1999)
Billing, P, D. Bendahmane and A. Swindale. 1999. Water and Sanitation Indicators Measurement Guide.
Food and Nutrition Technical Assistance (FANTA). 25p.
Web Location: http://www.fantaproject.org/downloads/pdfs/watsan.pdf
Location:  Section B >  indicators > FANTA WatSan
Keywords: Water and Sanitation; indicators
Review: Annual monitoring and impact indicators for watsan projects.


(Bonnard 2002)
Bonnard, P. April 2002. Title II Evaluations Scopes of Work, Technical Note 2. Food and Nutrition
Technical Assistance (FANTA). 12p.
Web Location: http://www.fantaproject.org/downloads/pdfs/tn2SOW.pdf
Location:  Section B >  General M&E > FANTA SoW
Keyword: Evaluation
Review: Summarizes what a Scope of Work should contain.

(Booth and Lucas 2002)
Booth, D and H Lucas. July 2002. Good Practice in the Development of PRSP Indicators and Monitoring
Systems, ODI Working Paper 172. 73p.
Web Location: http://www.odi.org.uk/PPPG/publications/working_papers/172.pdf
Location:  Section B >  indicators > ODI PRSP
Keywords: Indicators
Review: Desk study of Poverty Reduction Strategy Papers (PRSPs) for sub-Saharan Africa to assess good practice in
the development of PRSP indicators and monitoring systems. Its findings include information on what, how and
why to monitor.

(Boyce and Neale 2006a)
Boyce, C. and P. Neale. May 2006. Conducting in-depth interviews: A Guide for Designing and Conducting
In-Depth Interviews for Evaluation Input. Pathfinder International. 16p.
Web location: http://www.pathfind.org/site/DocServer/m_e_tool_series_indepth_interviews.pdf?docID=6301
Location:  Section B >  Methods > Pathfinder Interviews
Keyword: method (interviews)
Review: Guidance on conducting an interview with a sample key stakeholder interview guide in the appendix.

(Boyce and Neale 2006b)
Boyce, C. and P. Neale. May 2006. Using Mystery Clients: A Guide to Using Mystery Clients for Evaluation
Input. Pathfinder International. 20p.
Web Location: http://www.pathfind.org/site/DocServer/m_e_tool_series_mystery_clients.pdf?docID=6303
Location:  Section B >  Methods > Pathfinder MystClients
Keyword: method (mystery client)
Review: Guidance on how to use mystery clients in a clinical setting to evaluate programs.

                                                       C
(Caplan and Jones 2002)
Caplan, K. and D. Jones. July 2002. Practitioner Note Series: Partnership Indicators, measuring the
effectiveness of multi-sector approaches to service provision. 6p.
Web Location: http://www.bpd-waterandsanitation.org/bpd/web/d/doc_33.pdf
Location:  Section B >  Indicators > Partnership Ind
Keyword: indicators
Review: Guidance for developing indicators to measure partnership effectiveness, drawn from the water and
sanitation sector.

(Chapman and Wameyo 2001)
Chapman, J. and A. Wameyo. January 2001. Monitoring and evaluating advocacy: a scoping study. 47p.
Web Location: http://www.preval.org/documentos/00545.pdf
Location:  Section B >  General M&E > ActionAid AdvocacyEval
Keywords: evaluation, advocacy
Review: The study is an ActionAid initiative (rights-based approach to development). It details the challenges of
advocacy M&E and provides guidance and frameworks used by advocacy agencies to assess the impact of their
work.




(Cramb and Purcell 2001)
Cramb, R and T Purcell. 2001. Developing forage technologies with smallholder farmers: How to monitor
and evaluate impacts. Australian Centre for International Agricultural Research, Impact assessment
program: Working paper series #41. 50p.
Web Location: http://www.aciar.gov.au/web.nsf/att/JFRN-6BN94V/$file/wp41.pdf
Location:  Section B >  General M&E > Agr PM&E
Keywords: Participatory M&E, methods
Review: Focused on agriculture. Covers participatory evaluation, basic information on indicators and on developing
an M&E plan, and the following methods: mapping, diagrams, calendars, ranking (preference, pairwise, wealth and
wellbeing), and interviews (semi-structured and structured). Includes examples and illustrations.

                                                       D
(DAC 1999)
Development Assistance Committee (DAC) Network on Development Evaluation. 1999. Guidance on
evaluating humanitarian assistance in complex emergencies. 30p.
Web Location: http://www.oecd.org/dataoecd/41/57/35340909.pdf
Location:  Section B >  General M&E > DAC 1999
Keywords: Evaluation, Humanitarian
Review: Introduction to evaluating humanitarian assistance in complex emergencies.

(DAC 2000)
Development Assistance Committee (DAC) Network on Development Evaluation. May 2000. Effective
practice in conducting a multi-donor evaluation. 28p.
Web Location: http://www.oecd.org/dataoecd/41/61/35340484.pdf
Location:  Section B >  General M&E > DAC 2000
Keyword: Evaluation
Review: Guidance for conducting a joint evaluation when there are partnerships in development programs, including
pros and cons, rationale and steps.

                                                             E
(Egbert and Hoechstetter)
Egbert, M and S. Hoechstetter. Mission Possible: Evaluating Advocacy Grants. Foundation News.
Web Location: http://www.foundationnews.org/CME/article.cfm?ID=3545
Location: Not available
Keywords: Evaluation, advocacy
Review: This brief article provides guidance on how to evaluate advocacy projects and how they differ from service
delivery projects. It provides links to purchase other related resources.

                                                           G
(George)
George, C. The Quantification of Impact. Enterprise Development Impact Assessment Information Service.
17p.
Web location: http://www.enterprise-impact.org.uk/pdf/QuantitativeMethods.pdf
Location:  Section B >  Methods > EDIAIS QuantitativeMeth
Keywords: Methods (quantitative)
Review: This paper looks at the role of impact quantification in the monitoring and evaluation process, from
baseline through to end-of-project impact assessment. It also looks at means of developing quantifiable indicators at
macro level as well as project level, and how quantification can be applied to the sustainable livelihoods and human
rights approaches, all as part of integrated impact assessments.

(GNT 2006)
Government of the Northwest Territories (Canada). 2006. How to work effectively with an evaluation
consultant. 8p.
Web Location: http://www.gov.nt.ca/FMBS/documents/dox/Howtoworkevaluationconsultant.pdf
Location:  Section B >  General M&E > GNT Consultant
Keyword: Evaluation
Review: Short guide covering the process from the initial meeting with the consultant through to evaluating the evaluation.

                                                        H
(HIV Alliance)
International HIV/AIDS Alliance. NGO Support Toolkit:
"Monitoring and Evaluation"
Web Location: http://www.aidsalliance.org/sw17257.asp
Location: Not available
Keywords: HIV/AIDS programming, General M&E
Review: Short article on the rationale behind an M&E system for an NGO and its benefits.
"Monitoring"
Web Location: http://www.aidsalliance.org/sw22757.asp
Location: not available
Keywords: HIV/AIDS programming, monitoring
Review: Short article on the usefulness of monitoring, technical support information, and issues to consider, such as
the distinction between a monitoring system and meeting donor reporting requirements.
"Evaluation"
Web Location: http://www.aidsalliance.org/sw22759.asp
Location: not available
Keywords: HIV/AIDS programming, Evaluation
Review: Short article on the usefulness of evaluations, technical support information, and topics for evaluation
training.

                                                       I
 (IDS 1998)
Institute of Development Studies (IDS). November 1998. Participatory Monitoring and Evaluation: Learning from
Change. IDS Policy Briefing.
Web Location: http://www.ids.ac.uk/ids/bookshop/briefs/Brief12.html
Location:  Section B >  General M&E > IDS PM&E
Keywords: Participatory M&E
Review: The article compares conventional and participatory M&E and mentions the methods commonly used.

(IFRC 2005)
IFRC. 2005. Operational Framework for Evaluations. Monitoring and Evaluation Department. 19p.
Web Location: http://www.ifrc.org/cgi/pdf_evaluation.pl?operational-framework-revised.pdf
Location:  Section B >  General M&E > IFRC 2005
Keywords: Evaluation
Review: Presents the evaluation mandate of IFRC by clarifying principles, concepts, terms and processes of
evaluation as applied by the International Federation.

                                                           L
(Levinson et al 1999)
Levinson, FJ, BL Rogers, KM Hicks, T Schaetzel, L Troy and C Young. 1999. Monitoring and Evaluation: A
Guidebook for Nutrition Project Managers in Developing Countries. International Food and Nutrition
Center, Tufts University School of Nutrition Science and Policy. 200p.
Web Location: http://www.idpas.org/pdf/2018MonitoringAndEval.pdf
Location:  Section B >  General M&E > Tufts NutritionM&E
Keywords: General M&E
Review: This guide is directed to nutrition projects, but contains basic information to be used as guidance in
developing a monitoring plan, designing an evaluation, selecting indicators and data collection strategies
(including an overview of the following methods: qualitative and quantitative methods, observation, key informant
interviews, focus groups, surveys, direct measurements, secondary data), analyzing data and using the results.

                                                         M
(Magnani 1997)
Magnani, R. 1997. Sampling guide. Food and Nutrition Technical Assistance (FANTA). 52p.
Web Location: http://www.fantaproject.org/downloads/pdfs/sampling.pdf
Location:  Section B >  Methods > FANTA Sampling
Keyword: Methods (Sampling)
Review: This guide provides guidance on choosing samples of communities, households, and/or individuals for
program surveys in a manner that, when combined with appropriate indicators and evaluation study designs, will
permit valid conclusions to be drawn as to the effectiveness of programs.
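A quick illustration of the kind of calculation such a guide supports: how many households are needed to estimate a
proportion with a given precision. This is a generic sketch using the standard textbook formula with an assumed
design effect and response rate, not an excerpt from the guide itself:

    # Python sketch: households needed to estimate a proportion p within +/- d
    import math

    def sample_size_proportion(p=0.5, d=0.05, z=1.96, deff=2.0, response_rate=0.9):
        n = (z ** 2) * p * (1 - p) / d ** 2   # simple random sample size
        n *= deff                             # inflate for cluster sampling (assumed design effect)
        n /= response_rate                    # adjust for expected non-response (assumed rate)
        return math.ceil(n)

    print(sample_size_proportion())  # about 854 households under these assumptions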

(Mayoux a)
Mayoux, L. Qualitative Methods. Enterprise Development Impact Assessment Information Service. 17p.
Web Location: http://www.enterprise-impact.org.uk/pdf/QualMethods.pdf
Location:  Section B >  Methods > EDIAIS Qualitative
Keyword: methods (qualitative)
Review: Overview of how qualitative methods complement quantitative and participatory methods. Provides a
review of the main methods: informal interviews, case studies, and direct observation.

(Mayoux b)
Mayoux, L. Participatory Methods. Enterprise Development Impact Assessment Information Service. 20p.
Web Location: http://www.enterprise-impact.org.uk/pdf/ParticMethods.pdf
Location:  Section B >  Methods > EDIAIS Participatory
Keyword: method (participation)
Review: Introduction to participatory methods, including challenges, advantages and techniques.

(Mayoux c)
Mayoux, L. Whom do we talk to? Issues in Sampling. Enterprise Development Impact Assessment
Information Service. 20p.
Web Location: http://www.enterprise-impact.org.uk/pdf/Sampling.pdf
Location:  Section B >  Methods > EDIAIS Sampling
Keyword: method (sampling)
Review: Information about sampling for each type of method used: statistical (quantitative), qualitative and
participatory. Defines the different types of sampling methods.

(Mayoux d)
Mayoux, L. What do we want to know? Selecting indicators. Enterprise Development Impact Assessment
Information Service. 24p.
Web Location: http://www.enterprise-impact.org.uk/pdf/SelectingIndicators.pdf
Location:  Section B >  Indicators > EDIAIS indicators
Keywords: indicators
Review: Provides definitions for the different types of indicators, including SPICED and SMART indicators, among
others. Guidance on indicator selection.

                                                        N
(Neale et al 2006)
Neale, P, S. Thapa and C. Boyce. May 2006. Preparing a Case Study: A Guide for Designing and Conducting
a Case Study for Evaluation Input. Pathfinder International. 16p.
Web Location: http://www.pathfind.org/site/DocServer/m_e_tool_series_case_study.pdf?docID=6302
Location:  Section B >  Methods > Pathfinder CaseStudy
Keyword: Methods (case study)
Review: Detailed guidance on conducting a case study, including a sample consent form.




                                                        P
(Pisani 2003)
Pisani, E. Updated 2003. Estimating the size of populations at risk for HIV, issues and methods.
UNAIDS/IMPACT/FHI. 56p.
Web Location:
http://www.fhi.org/NR/rdonlyres/e66sj52tyha7m5dozchbbwptxepfqr47vmvjxvyo2dy7trd2ne5giddtvkksddwrpwxatd
gkprxwba/EstimatingSizePop.pdf
Location:  Section B >  Methods > HIV PopSize
Keywords: HIV, methods (population size)
Review: Covers the major methods available for population size estimation, with their strengths and weaknesses,
and gives examples. It also explores how best to choose the right method for a given country situation and sub-
population. The report, however, is not intended as a comprehensive guide to population size estimation: more
thorough tool kits will be needed and for some sub-populations may already be available.

                                                         R
(Ravallion 2005)
Ravallion, M. 2005. Evaluating Anti-Poverty Programs. World Bank. 90p.
Web Location: http://siteresources.worldbank.org/INTISPMA/Resources/383704-
1130267506458/Evaluating_Antipoverty_Programs.pdf
Location:  Section B >  Methods > WB StatsMethods
Keywords: evaluation, statistical methods
Review: The article reviews experimental and non-experimental statistical methods available for evaluating programs,
including propensity score matching, discontinuity designs, double and triple differences, and instrumental variables.
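Of the methods listed, the double difference is the easiest to illustrate: the impact estimate is the change over time in
the treatment group minus the change over time in the comparison group. A minimal sketch with made-up outcome
means (not figures from the paper):

    def double_difference(treat_pre, treat_post, control_pre, control_post):
        # Change in the treatment group net of the change in the comparison
        # group, which removes any common time trend from the estimate.
        return (treat_post - treat_pre) - (control_post - control_pre)

    # Hypothetical mean incomes before and after an anti-poverty program:
    print(double_difference(treat_pre=100, treat_post=130,
                            control_pre=100, control_post=115))  # 15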

(Riely et al 1999)
Riely, F., N. Mock, B. Cogill, L. Bailey, and E. Kenefick. 1999. Food Security Indicators and Framework for
Use in the Monitoring and Evaluation of Food Aid Programs. Food and Nutrition Technical Assistance
(FANTA). 50p.
Web Location: http://www.fantaproject.org/downloads/pdfs/fsindctr.pdf
Location:  Section B >  Indicator > Fanta FoodSec
Keywords: indicators, Food Security
Review: This guide outlines a process for identifying indicators and provides a conceptual framework for
understanding food security issues.

                                                            S
(Swindale and Bilinsky 2005)
Swindale, A. and P. Bilinsky. 2005. Household Dietary Diversity Score (HDDS) for Measurement of
Household Food Access: Indicator Guide. Washington, D.C.: Food and Nutrition Technical Assistance
Project, Academy for Educational Development. 12p.
Web Location: http://www.fantaproject.org/downloads/pdfs/HDDS_Mar05.pdf
Location:  Section B >  Indicators > FANTA FoodAccess
Keyword: Indicators, food security
Review: Mostly related to Title II projects, this guide explains household-level food security indicators in terms of
access and utilization.
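As a rough sketch of how the score works (based on the guide's general approach; the food-group list below is
reproduced from memory and should be checked against the guide): the HDDS is the count of distinct food groups a
household consumed in the previous 24 hours, averaged across surveyed households.

    # Hypothetical 24-hour recall for one household: 1 = any member consumed
    # the food group, 0 = no one did. Food-group names are assumed here.
    recall = {
        "cereals": 1, "roots_tubers": 1, "vegetables": 1, "fruits": 0,
        "meat": 0, "eggs": 0, "fish": 1, "pulses_legumes_nuts": 1,
        "milk_products": 0, "oils_fats": 1, "sugar_honey": 1, "miscellaneous": 0,
    }

    hdds = sum(recall.values())   # this household's dietary diversity score
    print(hdds)                   # 7 food groups consumed

    # The program indicator is the mean HDDS across all surveyed households,
    # compared between baseline and follow-up rounds.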

(Swindale and Ohri-Vachaspati 2005)
Swindale, A. and P. Ohri-Vachaspati. 2005. Measuring Household Food Consumption: A Technical Guide.
Washington, D.C.: Food and Nutrition Technical Assistance (FANTA) Project, Academy for Educational
Development (AED). 93p.
Web Location: http://www.fantaproject.org/downloads/pdfs/foodcons.pdf
Location:  Section B >  Indicators > FANTA FoodConsumption
Keyword: Food Security
Review: The guide describes the process and procedures for collecting the information needed to assess the food-intake
requirements of a household and a step-by-step analysis of the food consumed. The appendices provide detailed
information about analyzing the data.

                                                        U
(USAID Tips 1996a)
USAID Center for Development Information and Evaluation. 1996a. Conducting a participatory evaluation.
Performance Monitoring and Evaluation TIPS. 4p.
Web Location:
http://synkronweb.aidsalliance.org/graphics/NGO/documents/english/607_USAID_particip_evaluatino.pdf
Location:  Section B >  General M&E > USAID PM&E
Keyword: Participatory M&E
Review: The short document presents an introduction to conducting participatory evaluations, including step-by-step
guidance and a comparison with traditional evaluations.

(USAID Tips 1996b)
USAID Center for Development Information and Evaluation. 1996b. Conducting Key Informant Interviews.
Performance Monitoring and Evaluation TIPS. 4p.
Web Location: http://www.usaid.gov/pubs/usaid_eval/pdf_docs/pnabs541.pdf
Location:  Section B >  Method > USAID Interviews
Keywords: Methods (Interviews)
Review: Short introduction to conducting interviews including advantages and disadvantages, and steps to follow.

(USAID Tips 1996c)
USAID Center for Development Information and Evaluation. 1996c. Preparing an Evaluation Scope of Work
(SoW). Performance Monitoring and Evaluation TIPS. 4p.
Web Location: http://www.usaid.gov/pubs/usaid_eval/pdf_docs/pnaby207.pdf
Location:  Section B >  General M&E > USAID SoW
Keywords: Evaluation
Review: Description of the elements a good SoW should contain.

(USAID Tips 1996d)
USAID Center for Development Information and Evaluation. 1996. Using Direct Observation Techniques.
Performance Monitoring and Evaluation TIPS. 4p.
Web Location: http://www.usaid.gov/pubs/usaid_eval/ascii/pnaby208.txt (note: adobe link available, but not
accessible at the time: http://www.usaid.gov/pubs/usaid_eval/pdf_docs/pnaby208.pdf)
Location:  Section B >  Methods > USAID Obs
Keywords: Methods (direct observation)
Review: Steps for conducting direct observations.

(USAID Tips 1996e)
USAID Center for Development Information and Evaluation. 1996. Using Rapid Appraisal Methods.
Performance Monitoring and Evaluation TIPS. 4p.
Web Location: http://www.usaid.gov/pubs/usaid_eval/ascii/pnaby209.txtt (note: adobe link available, but not
accessible at the time: http://www.usaid.gov/pubs/usaid_eval/pdf_docs/pnaby209.pdf)
Location:  Section B >  Methods > USAID RapidAppr
Keywords: Methods (rapid appraisal)
Review: Short article on rapid appraisal methods with a discussion of strengths and limitations, when they are
appropriate, and the methods commonly used during the process.

(USAID Tips 1996f)
USAID Center for Development Information and Evaluation. 1996. Selecting Performance Indicators.
Performance Monitoring and Evaluation TIPS. 4p.
Web Location: http://www.usaid.gov/pubs/usaid_eval/pdf_docs/pnaby214.pdf
Location:  Section B >  Indicators > USAID Indicators
Keywords: indicators
Review: Guidance in selecting performance indicators.


(USAID Tips 1996g)
USAID Center for Development Information and Evaluation. 1996. Preparing a Performance Monitoring
Plan. Performance Monitoring and Evaluation TIPS. 4p.
Web Location: http://www.usaid.gov/pubs/usaid_eval/pdf_docs/pnaby215.pdf
Location:  Section B >  General M&E > USAID Monitoring
Keywords: Monitoring
Review: Short article introducing a performance monitoring plan, with step-by-step guidance.

(USAID Tips 1996h)
USAID Center for Development Information and Evaluation. 1996. Establishing Performance Targets.
Performance Monitoring and Evaluation TIPS. 5p.
Web Location: http://www.usaid.gov/pubs/usaid_eval/pdf_docs/pnaby226.pdf
Location:  Section B >  General M&E > USAID PerformanceTargets
Keywords: General M&E
Review: Short article introducing performance targets and benchmarks, and how to use them.

(USAID Tips 1996i)
USAID Center for Development Information and Evaluation. 1996. Conducting Customer Service
Assessments. Performance Monitoring and Evaluation TIPS. 4p.
Web Location: http://www.usaid.gov/pubs/usaid_eval/pdf_docs/pnaby227.pdf
Location:  Section B >  General M&E > USAID CustService
Keywords: General M&E
Review: Introduces the concept of customer service assessments, how they relate to M&E, and the steps in
conducting them.

(USAID Tips 1996j)
USAID Center for Development Information and Evaluation. 1996. Conducting Focus Group Interviews.
Performance Monitoring and Evaluation TIPS. 4p.
Web Location: http://www.usaid.gov/pubs/usaid_eval/pdf_docs/pnaby233.pdf
Location:  Section B >  Method > USAID FocusGrps
Keywords: Methods (focus groups)
Review: Presents the advantages and limitations, usefulness and steps for conducting a focus group.

                                                        V
(Van de Putte 2001)
Van de Putte, B. 2001. Follow-up to evaluations of humanitarian programmes. ALNAP. 44p.
Web location: http://www.reliefweb.int/rw/lib.nsf/db900SID/LGEL-5JHMUQ/$FILE/alnap-follow-
apr01.pdf?OpenElement
Location:  Section B >  General M&E > ALNAP FollowUp to Evals
Keywords: Evaluations, humanitarian assistance
Review: Study to improve the way commissioning agencies use evaluations and follow up on the findings and
recommendations (and how this differs from the development context).

(Vermillion 2000)
Vermillion, DL. 2000. Guide to Monitoring and Evaluation of Irrigation Management Transfer. The
Japanese Institute for Irrigation and Drainage (JIID). 77p.
Web Location: Not Available
Location:  Section B >  General M&E > JIID Irrigation M&E
Keyword: General M&E
Review: Brief guide for designing and implementing M&E in irrigation management transfer programs.

                                                       W
(WB)
World Bank, Poverty Net. Evaluation designs.
Web Location:
http://web.worldbank.org/WBSITE/EXTERNAL/TOPICS/EXTPOVERTY/EXTISPMA/0,,contentMDK:20188242
~menuPK:412148~pagePK:148956~piPK:216618~theSitePK:384329,00.html
Location: Not Available
Keywords: Evaluation
Review: Explanation of the different evaluation designs: experimental, quasi-experimental and non-experimental
designs.

(WB 2002)
World Bank. October 2002. Sleeping on our own mats: An introductory guide to community-based
Monitoring and Evaluation. 55p.
Web Location: http://siteresources.worldbank.org/INTPCENG/214574-
1116505633693/20509339/communitybased.pdf
Location:  Section B >  General M&E > WB CommunityM&E
Keywords: Participatory M&E
Review: Introduction to community-based M&E, based on World Bank research on projects in Africa.

(WV 2003)
World Vision. 2003. TDI (Transformational Development Indicators) Field Guide. World Vision Development
Resources Team.
Web Location: http://www.transformational-
development.org/Ministry/TransDev2.nsf/subsection/83F75B520A2A847488256F40005C225C?editdocument
(downloads individual volumes in zip format)
Location: Vol. 2-5:  Section B >  Indicators >  WV indicators > (Each volume in zip file)
          Vol. 6-8:  Section B >  Method >  WV Methods > (Each volume in zip file)
Keywords: indicators, methods
Review: The field guide provides the technical basis for the measurement of the transformational development
indicators, including the methods for collecting, analyzing, and reporting on the indicators. A table of the
transformational development indicators is included in each volume:
     o Volume 1. Getting Started (Introduction to TDI, A Guide to Planning, Implementation, and Translation).
     o Volume 2. Well Being of Children and Families – Part I (A Guide to the Indicators on Nutrition,
         Immunization, and Education).
     o Volume 3. Well Being of Children and Families – Part II (A Guide to the Indicators on Water, Diarrhea
         Management, Household Resilience, and Poorest Households).
     o Volume 4. Transformed Relationships (A Guide to the Indicators on Caring for Others, Emergence of
         Hope, and Christian Impact).
     o Volume 5. Interdependent and Empowered Communities (A Guide to the Indicators on Community
         Participation and Social Sustainability).
     o Volume 6. Methods – Focus Group Guidelines & Document Review.
     o Volume 7. Methods – Survey Guidelines.
     o Volume 8. Analysis, Reporting and Learning

                                                       Z
(Zeller 2004)
Zeller, M. 6 February 2004. Review of Poverty Assessment Tools. Accelerated Microenterprise Advancement
Project, United States Agency for International Development. 53p.
Web Location:
http://www.povertytools.org/Project_Documents/Review%20of%20Poverty%20Assessment%20Tools.pdf
Location:  Section B >  Method > USAID PovertyAssSurveys
Keywords: methods (surveys)
Review: Article reviews the following tools: Benchmarks; Living Standard Measurement Surveys (LSMS); the
Social Dimensions of Adjustment Integrated Surveys (SDA-IS); Social Dimensions of Adjustment Priority Survey
(SDA-PS); Core Welfare Indicators Questionnaire (CWIQ); Demographic and Health Survey (DHS).



