
YEN Evaluation Clinic: Nairobi
22-24 November 2010
Supported by the Jacobs Foundation, the Swedish International Development Agency and the Youth
Entrepreneurship Facility through the Danish-led Africa Commission




Table of Contents

Background
Results of Working Groups
Conclusion

Annex 1. Participants' Evaluation
Annex 2. Participant list
Annex 3. Agenda
Annex 4. List of Resources




The complete set of pictures from the event can be found here


IMPORTANT: To gain access to the presentations and resources that accompany this report, request
to become a member of the YEN Clinic groupsite at http://yenclinic.groupsite.com. Links contained
within this report will only be accessible to registered members of the YEN Clinic groupsite. All Clinic
participants qualify as members.




A. Background and context
From 22-24 November 2010, the Youth Employment Network (YEN) hosted its second annual Evaluation
Clinic in Nairobi, Kenya. The Clinic provided a venue for evaluation specialists to collaborate with youth
entrepreneurship project teams on the development of impact evaluation plans. The three-day event
included a series of interactive "consultation" sessions, lectures from renowned evaluation experts, and
the launch of a series of cutting-edge impact evaluations. Partners collaborating with YEN on the delivery
of the Clinic included the International Initiative for Impact Evaluation (3ie), Innovations for Poverty
Action (IPA), the International Centre for Research on Women (ICRW), the Abdul Latif Jameel Poverty
Action Lab (J-PAL) and the World Bank.

The theme for this year's Clinic was "Youth Entrepreneurship", with a geographic focus on Sub-Saharan
Africa. Recently, entrepreneurship programmes have been receiving increasing attention from
governments, donors and multilateral agencies as an alternative job creation strategy. Given the
limited absorptive capacity of existing formal labour markets in the developing world, promotion of
youth entrepreneurship and self-employment is one of the few feasible options for creating employment
opportunities in both the informal and formal economy. Nevertheless, the evidence supporting the
positive impact of entrepreneurship schemes is extremely weak. The Clinic seeks to assist youth
entrepreneurship projects in implementing successful impact evaluations and disseminating the resulting
findings.

YEN and impact evaluation

In the past two years, YEN has joined a host of players in the international community dedicated to
improving the evidence base for effective policies and programmes through the use of impact
evaluations. Impact evaluations provide unbiased, attributable evidence that a particular programme or
policy has "worked", showing "why" and "how" an intervention has been successful by comparing
programme impacts against a counterfactual.
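
To make the idea of a counterfactual concrete, the following is a minimal simulated sketch (illustrative
only; the outcome variable, sample sizes and effect sizes are assumptions, not figures from this report).
A naive before/after reading would attribute the whole change to the programme, while comparing the
treatment group against a randomized control group isolates the programme's contribution.

```python
import random

# Illustrative simulation: everyone's outcome drifts upward over time, but only
# treated participants also receive the programme effect. Comparing treated and
# control groups recovers the programme effect; a simple before/after comparison
# would mix it with the general time trend.
random.seed(0)

def outcome(treated: bool) -> float:
    baseline = random.gauss(100, 10)   # pre-programme level (arbitrary units)
    time_trend = 5                     # change that happens with or without the programme
    programme_effect = 8 if treated else 0
    return baseline + time_trend + programme_effect

treatment = [outcome(True) for _ in range(1000)]
control = [outcome(False) for _ in range(1000)]   # the counterfactual group

estimated_impact = sum(treatment) / len(treatment) - sum(control) / len(control)
print(f"Estimated impact: {estimated_impact:.1f} (true programme effect is 8)")
```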

Unfortunately, the use of impact evaluation has largely been reserved for researchers, academics and
high-level technical experts, and has not necessarily been accessible to the practitioners and programme
staff managing interventions. The reasons for this are understandable: impact evaluations are costly,
technically demanding, and their application is limited to the few interventions with appropriate design
features. This limits the usefulness of impact evaluation methodologies for practitioners as well as the
opportunity for results and lessons learned to be applied.



Based on this assessment, YEN's intervention in impact evaluations fulfils two distinct needs:


        1) The need to build the evidence base for effective programme design and delivery focused
        on the specific area of youth employment. This will be achieved by providing technical and
        financial support to qualified impact evaluations.

        2) The need to build the capacity of youth employment practitioners to conduct rigorous
        evaluations. This includes advising practitioners on evaluation designs that meet the needs and
        resources of their projects. The designs of most small and medium-sized projects will not be
        conducive to an impact evaluation.


Fund for Evaluation in Youth Employment

The Clinic is delivered as part of the Fund for Evaluation in Youth Employment. The Fund provides
technical and financial support to youth entrepreneurship projects based on a competitive selection of
proposals (more information on the Fund is available here). In the first selection stage, interested
organizations responded to a call for proposals for impact evaluation plans. The four most qualified
proposals were then shortlisted and invited to act as "live case studies" for the Clinic. In the second
selection stage, the shortlisted projects were asked to submit full evaluation plans, from which the best
will receive grants to conduct their impact evaluations.

Results of the Call for Proposals

Eighty-eight proposals were received across twenty countries, demonstrating the huge demand for
assistance on impact evaluations. Notably, thirteen proposals were received from Kenya, eighteen from
Uganda and nineteen from Nigeria. A majority of the proposals submitted offered to evaluate either
entrepreneurship training or microcredit projects.




Results of the proposals also demonstrated the increasing linkages between the development and
research communities: 20 academic institutions, 10 research firms and various research-focused NGOs
and international agencies were identified as research partners. Links to experienced, capable research
partners are essential to the production of quality impact evaluations.

Shortlisted projects (live case studies for Clinic)

The following projects received the highest scores for relevance of the evaluation question, soundness of
the identification strategy, coherence between the evaluation question and outcome indicators, and
feasibility of resources for the evaluation (including co-funding):

    1. Apprenticeship Training Programme and Entrepreneurial Support for Vulnerable Youth in
       Malawi (TEVETA and the World Bank).

    2. Evaluating the Impact of an Entrepreneurship Programme for Adolescent Girls in Tanzania
       "Empowerment and Livelihood for Adolescents" (BRAC and University College London).

    3. The Youth Venture Initiative: Harmonized Programmes for Economic Opportunities (Streetkids
       International and the Swiss Academy for Development).

    4. Financial Education and Mentorship for University Students and Youth in Kenya – Impact
       Evaluation (Equity Bank and Kenyatta University).




B. Results of Working Groups


Evaluating the Impact of an Entrepreneurship Programme for Adolescent Girls
Country: Tanzania
Implementing agency: BRAC
Research partners: University College London, London School of Economics, World Bank
Representatives: Imran Rasul (UCL) & Munshi Sulaiman (LSE/BRAC)
Evaluation Taskforce: Minna Mattero and Susana Puerto-Gonzalez (YEF/YEN)

Project description:

BRAC is an international development organization founded in Bangladesh in 1972 that specializes in
poverty eradication. The BRAC intervention to be evaluated is known as the "Empowerment and
Livelihood for Adolescents" (ELA) programme. The objective of the ELA programme is to empower
adolescent girls both in terms of their entrepreneurial and business skills and in terms of their social
empowerment. The aim is to build the girls' capacity to lead a life of self-reliance and dignity, and to
become active agents of social change in their own families and communities.

The ELA provides various services:

        1. Entrepreneurship: livelihood training activities that promote self-reliance; financial literacy
        training for better management of accounts.
        2. Social empowerment: life skills based education modules that provide basic training on
        health, reproductive health, risky behaviours, family, relationships and planning.
        3. Microfinance for small-scale enterprise to promote self-reliance.

In addition, the ELA is operated from a specific (rented) location in each village. This physical space is
known as the 'adolescent club'. The club provides a safe space for the girls to share their experiences,
enjoy reading and play games.

Evaluation design:

Evaluation questions: How effective is social empowerment and livelihood training in raising the well-
being of adolescent girls? How much additional impact does microfinance have?

Selection strategy: Randomized control trial at village level with multiple treatment groups.

Of the 150 villages that are eligible to receive the intervention, 100 villages are randomly chosen to be
treated with the first two components of the ELA programme: entrepreneurship and social
empowerment training. Of the 100 villages that complete the trainings, a further random selection of 50
villages will be chosen to receive small loans 3-6 months after completion of the trainings.
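
This two-stage, village-level randomization can be illustrated with the minimal sketch below. The village
identifiers and the seed are placeholders rather than BRAC data; only the arm sizes (150 eligible villages,
100 training villages, 50 of which also receive microfinance) come from the design described above.

```python
import random

# Two-stage village-level randomization with multiple treatment groups.
random.seed(2010)  # fixed seed so the assignment is reproducible

villages = [f"village_{i:03d}" for i in range(1, 151)]          # 150 eligible villages

training = random.sample(villages, 100)                          # ELA training arms
control = [v for v in villages if v not in training]             # 50 pure control villages

microfinance = random.sample(training, 50)                       # training + small loans
training_only = [v for v in training if v not in microfinance]   # training without loans

for arm, members in [("control", control),
                     ("training only", training_only),
                     ("training + microfinance", microfinance)]:
    print(f"{arm}: {len(members)} villages")
```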

Sample: With around 130 potentially eligible girls per village, there are 13,000 potentially eligible girls in
the surveyed villages. With an initial participation rate of 25% in treated villages, around 3,250 girls are
expected to be actual beneficiaries of the programme in the short run.

A separate survey was also administered to parents of the participating girls to assess the expectations
and aspirations of parents and the effect on girls’ welfare.

Results of baseline survey: Across all 150 eligible villages, the baseline survey interviewed 5,148
adolescent girls (3,388 in treatment villages and 1,760 in controls) so there are around 30-40 surveyed
girls in each village. A similar sample size is planned to be collected at the follow-up survey, allowing for
additional costs of having to track girls and parents over time if they have changed location.

Detailed analysis of the baseline data has been conducted and shows that, on the vast majority of
observable characteristics of the adolescent girls and their households, there are no significant
differences between treatment and control villages.
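
A balance check of this kind can be sketched as below (synthetic data and variable names, not the BRAC
baseline), comparing one baseline characteristic across treatment and control villages with a two-sample
t-test; the group sizes mirror the 3,388 and 1,760 girls surveyed.

```python
import random
from scipy.stats import ttest_ind

# Synthetic example of a baseline balance check on one observable characteristic.
random.seed(1)
treatment_age = [random.gauss(16.0, 1.5) for _ in range(3388)]  # girls in treatment villages
control_age = [random.gauss(16.0, 1.5) for _ in range(1760)]    # girls in control villages

t_stat, p_value = ttest_ind(treatment_age, control_age)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")  # a large p-value is consistent with balance
```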

Timeline:

    •   The baseline survey was conducted during Jul-Sep, 2009
    •   Program implementation started from Oct, 2009
    •   Microcredit services are being provided from Feb, 2010.
    •   Follow-up survey is planned to take place during Jul-Sep, 2011.
    •   Evaluation results will be out by early 2012


Recommendations from Taskforce:

    •   The evaluation should investigate methods for providing qualitative analysis. It was suggested to
        add games and simulations when interviewing participants.
    •   The evaluation should also find a strategy for looking at effects on the adolescent boy
        population.
    •   Can there be HIV/AIDS testing at the follow-up survey?




Financial Education and Mentorship for University
Students and Youth – Impact Evaluation
Country: Kenya
Implementing agency: Kenyatta University and Equity Bank Foundation
Research partners: Microfinance Opportunities
Representatives: Dr. Dinah Tumuti and Dr. Margaret Gecaga (Kenyatta)
Evaluation Taskforce: Drew Gardiner (YEN) and Veronica Chau (Dalberg)


Project description:

The project is targeted at 4,000 university students and 8,000 youths aged 18-35 years in 135
communities in Kenya. The project is implemented by the Equity Bank Foundation in partnership with
Kenyatta University – the first time that the largest university and the largest microfinance bank in
Kenya will cooperate. The project's goal is to equip youth with the knowledge and skills necessary for
wealth and job creation.
The project has three main components:


1. Training component (3 days): The training will cover seven financial education areas - budgeting,
   saving, debt management, financial negotiations, banking services, investment and insurance. It is
   hoped that youth will change their attitude and behavior towards financial matters in the long term.


2. Linking the trainees with the bank (6 months): participants who complete the training will be linked
   with mentors from Equity Bank. The trainees will be tracked using the bank's information management
   system to determine whether their lives and livelihoods are changing as a result of the project
   intervention. Linking the trainees with the bank provides a foundation for practical advice and
   information on the Bank's operations and services.


3. Financial services (6 months): Financial services will be made available to training graduates
   through local Equity Bank branches. Services could include microcredit loans, savings accounts,
   insurance or investment products. The type of financial service will be decided based on the
   applicant's creditworthiness and financial experience.


The project will use Kenyatta University students as trainers to train other students and youth selected
from the pilot areas. This will be an extension of the existing Student Voluntary Community Service
Initiative implemented by Kenyatta University and Equity Bank, under which students commit
themselves to two weeks of community service during vacations.




Evaluation design:

Research question: Does the combination of peer-to-peer financial education, mentoring and financial
services contribute to behavior change which improves youth’s financial security?

Identification strategy:

-Stage 1: Before and after analysis of participants in the community.

The design of the project in its current form is not amenable to an impact evaluation. The short training
period (3 days) is not considered long enough to produce a net impact on beneficiaries. Furthermore,
the selection of participants has already begun without a baseline survey.

It is recommended that, instead of an impact evaluation, the project be evaluated using a "before and
after" survey. A short entry survey of programme participants would be conducted on the first day of
training and these results would be compared to an exit survey completed at the end of the 6 months of
mentorship and financial services. While this exercise would not produce “net” impact findings, it would
provide an opportunity for the project team to learn basic monitoring techniques such as survey design
and analysis.

The before and after survey should be complemented by a process evaluation which would investigate
how the key design features of the programme can be improved.

Sample: All 8,250 participants take the entry and exit surveys.

-Stage 2: Randomized control trial with double randomization

Once key design features have been revised and improved, the programme team can determine
whether an RCT is appropriate. If so, the best tactic to avoid selection bias is to randomize at the level of
both the trainers and the participants.
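
As a rough illustration of what randomizing at both levels could look like, the sketch below assigns
trainers and participants independently at random; all identifiers, counts and the seed are hypothetical
and not drawn from the project.

```python
import random

# "Double randomization": randomize which trainers deliver the programme and,
# independently, which eligible participants are offered the training.
random.seed(7)

trainers = [f"trainer_{i:02d}" for i in range(1, 21)]
participants = [f"participant_{i:04d}" for i in range(1, 401)]

active_trainers = random.sample(trainers, len(trainers) // 2)       # trainers who deliver
treated = set(random.sample(participants, len(participants) // 2))  # participants offered training

# Treated participants are then matched at random to one of the active trainers.
allocation = {p: random.choice(active_trainers) for p in sorted(treated)}

print(f"{len(active_trainers)} active trainers, {len(treated)} treated participants")
```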

Comments from Taskforce:
Many of the comments focused on the design of the project rather than on the impact evaluation:

    •   What is the role of the Equity Bank Foundation? Discussions should be started with Equity to
        identify appropriate mentors and review the eligibility of youth to obtain financial products.
        Equity should avoid pushing products on youth who are not creditworthy.
    •   Three days of training is not enough time to provide participants with financial literacy. Training
        time should be extended to 5-10 days.
    •   The project should start by developing a rigorous monitoring plan which includes a SMART
        results chain, a monitoring scorecard, regular surveying of programme participants and
        monitoring training of programme staff. Follow-up from the Taskforce will concentrate on these
        elements.




Youth Venture Initiative
Country: Ethiopia
Implementing agency: Streetkids International
Research partners: Swiss Academy for Development
Representatives: Kristy Vanderplas (SKI) & Katherina Wespi (SAD)
Evaluation Taskforce: Nathan Fiala (World Bank) & Constanze Lullies (Jacobs Foundation)

Project description

Streetkids International is a non-profit agency whose mission is to equip street children with business
skills and health knowledge. Its multi-pronged youth livelihood development programme trains youth
workers to deliver the Street Business and Street Banking training programmes. The training is delivered
through SKI's local implementing partner, the Emmanuel Development Association.


The training programmes provide youth workers with the skills and tools needed to facilitate
entrepreneurship training (through training of trainers, or ToTs) and to teach the foundations of savings,
loan and credit management to marginalised youth with low literacy and numeracy skills. The training is
grounded in Streetkids International's internationally recognized Street Business and Street Banking
Toolkits. The Street Business Toolkit imparts basic financial education to youth and leads them through
the steps required to create a business plan. The Street Banking Toolkit complements the business
training programme and focuses on how youth can access resources to launch their business.


During the evaluation period, one ToT and six Street Business and Street Banking trainings are scheduled.
The Street Business and Street Banking trainings will be facilitated by youth workers who have completed
the initial ToT training. Twelve youth workers and 24 youth will participate in the ToT, and 15-20 youth
will participate in each scheduled Street Business and Street Banking training. Each training session has an
equal number of males and females. Youth who successfully complete the training will be eligible to
receive a business grant and access savings products.


Evaluation design


Evaluation questions: Do the trainings improve the livelihood conditions of youth? Do youth maintain,
improve or start up a business as a result of the training? Do youth have improved psycho-social well-being
as a result of the training? Heterogeneity: understanding the effect of gender, literacy, prior business
experience, age, family situation, abuse and rural/urban location.


Identification strategy:

Phase 1 – Development of rigorous monitoring plan for SKI operations



SKI's delivery model is to work through a network of implementing partners who are responsible for
delivering Streetkids training to disadvantaged children and youth. This reliance on implementing
partners makes results measurement difficult, as the local partners do not always have the experience,
or assign the priority, needed to monitor results closely. The indicators that are monitored also often
differ across SKI's various projects.

It was decided that an impact evaluation would be postponed to a second phase and that a rigorous
monitoring plan would be developed first. A rigorous monitoring framework would include the following
elements:

    •   A results chain with consistent outcome indicators across all SKI projects
    •   A data collection strategy detailing what, how and when data will be collected
    •   A monitoring scorecard to report on the achievement of outcome targets
    •   Regular surveying of programme participants before and after trainings
    •   Training of local partners on how to use the monitoring framework

Phase 2 – Randomized control trial with multiple treatment groups

In year 2, once the monitoring plan has been developed and implemented, SKI will conduct a
randomized control trial using multiple treatment groups. Randomization will be done at two levels:
amongst SKI's implementing partners and amongst street kid programme participants. This double
randomization will allow SKI to test both the performance of its implementing partners and the impact
of the programme on beneficiaries.

In the interest of cost-effectiveness, SKI's implementing partners will be trained by the Swiss Academy
for Development to conduct the household surveys. Primary data will be complemented by already
existing data from interviews, focus group discussions and the most significant change technique. The
evaluation will also take into account secondary country data such as Labour Force Surveys,
Demographic and Health Surveys (DHS), Living Standard Measurement Surveys (LSMS), censuses,
administrative records and official national estimates.

Sample size: 100 implementing organizations (50 control, 50 treatment); 1800 participants (600
treatment 1, 600 treatment 2, 600 control).

Comments from taskforce:

    •   The evaluation should not only measure outcomes, but should also seek to unlock the project's
        "black box": what is the project's theory of change, and how can indicators be defined to add
        context to the evaluation?
    •   Be aware of control group contamination, such as:
            -   contamination by other social programmes with similar target groups (be aware of where
                these risks are);
            -   spillover from contact between treatment and control groups.
    •   Sample size: a power calculation will be needed given the small sample size (a sketch of such a
        calculation follows this list).
    •   Importance of primary data collection: training of enumerators, testing the questionnaire and
        ensuring quality data collection will be the backbone of a successful evaluation.
    •   Mobility of youth: a good tracking mechanism is needed to track upwardly mobile street
        children.
    •   How long will impact take? A good results chain indicates when the expected impact will be
        realized, and the follow-up survey should be planned accordingly.
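
As a reference for the power-calculation point above, the sketch below uses the standard two-sample
normal approximation; the effect sizes are illustrative assumptions, not figures from the SKI evaluation,
and a clustered design (randomizing implementing partners rather than individuals) would additionally
require a design-effect adjustment.

```python
from scipy.stats import norm

# Required sample size per arm for a two-arm comparison of means:
# n = 2 * (z_{1-alpha/2} + z_{1-beta})^2 / d^2, where d is the standardized effect size.
def n_per_arm(effect_size: float, alpha: float = 0.05, power: float = 0.80) -> int:
    z_alpha = norm.ppf(1 - alpha / 2)
    z_beta = norm.ppf(power)
    return int(round(2 * (z_alpha + z_beta) ** 2 / effect_size ** 2))

print(n_per_arm(0.20))  # roughly 392 participants per arm to detect a small effect
print(n_per_arm(0.35))  # roughly 128 per arm for a moderate effect
```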




Apprenticeship Training Program and Entrepreneurial
Support for Vulnerable Youth

Country: Malawi
Implementing agency: Technical, Entrepreneurial and Vocational Education and Training Authority
(TEVETA)
Research partners: World Bank & the National AIDS Commission
Representatives: Victor Orozco (World Bank), Sylvan Herskowitz (World Bank) & Malumbo Gondwe
(National AIDS Commission).
Evaluation Taskforce: Markus Pilgrim (YEN) & Vera Chiodi (J-PAL)

Project description

TEVETA was created by government decree in 1999 as Malawi’s apex body for delivering technical and
vocational training. The objective of this intervention is to provide vulnerable youth in Malawi, including
orphans and out-of-school female youth, an opportunity to enhance their employability and earning
potential, thus reducing high risk behavior that increases vulnerability to HIV infection. This intervention
is motivated by the situation faced by Malawian youth, notably young women: low skill levels as
measured by educational outcomes, coupled with high rates of unemployment, extreme poverty,
high risk sexual behavior (early average age at first intercourse, inter-generational sex, transactional sex,
and multiple concurrent partnerships), and high HIV prevalence. By providing youth with training and
microcredit opportunities (in the form of business start-up kits), they can improve their employability
and earning potential. Newly skilled Malawian youth will be able to increase their income, thus reducing
high risk behavior and their exposure to HIV/AIDS, especially among young women.


Evaluation design

Evaluation question: Will skills training and start-up kits for vulnerable youth create better economic
outcomes for the targeted population? Does the intervention (apprenticeship training program +
microcredit/start-up kit) for vulnerable youth reduce vulnerability to health risks and risky
behavior?

Selection method: Randomized control trial with a phase-in approach

The vocational program will be evaluated through an experimental design that randomly assigns eligible
communities to two phases of program roll-out. Under this design, communities are randomly phased
into the program every six months, with those communities assigned to later periods serving as control
groups for the communities who receive the intervention earlier in the program. Thus, given the
randomization at the community level, the average characteristics of the treated and control groups
should be similar. Econometric work will further improve the estimation of the program’s effects by
controlling for differences in exogenous initial conditions (i.e., education, social capital) and allowing for
interaction effects.
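
A minimal sketch of this phased roll-out is shown below (the community names, count and seed are
placeholders, not project data): communities are randomly ordered and split into two roll-out phases six
months apart, with later-phase communities serving as controls for the earlier phase.

```python
import random

# Randomly assign eligible communities to two roll-out phases.
random.seed(42)

communities = [f"community_{i:02d}" for i in range(1, 31)]
order = random.sample(communities, len(communities))   # random ordering

phase_1 = order[: len(order) // 2]    # treated in the first six months
phase_2 = order[len(order) // 2 :]    # control for phase 1, treated six months later

print(f"Phase 1 (early treatment): {len(phase_1)} communities")
print(f"Phase 2 (initial control): {len(phase_2)} communities")
```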


Sample size: 23 out of 30 eligible districts and 1,500 out of 1,900 eligible participants will participate in
the survey. A local survey firm with substantial field experience, Innovative Knowledge Initiative, was
hired to conduct the baseline survey (over 1,100 beneficiaries were surveyed). Depending on funding
availability, two follow-up surveys will be conducted and complemented with qualitative evidence.
Participating mentors and beneficiaries will provide inputs in rounds of open-ended surveys in order to
better understand the link between the intervention and its expected outcomes.

Timeline:

         •   Nov-Dec 2009: Participant and Master Craftsmen identification
         •   April 2010: Baseline
         •   June-July 2010: Master Craftsmen training
         •   August-Dec: Cohort 1 Training
         •   Dec-Jan 2010/11: Trainer Questionnaire
         •   May 2011: Midline
         •   June 2011: Cohort 2 Training
         •   April 2012: Endline


Recommendations from taskforce

        •    Is it possible to add a cross-cutting design with multiple control groups?
        •    How will information spillovers be controlled for? What about heterogeneity of impact?
        •    Originally, the program targeted both an economic and a health impact (reduced risky
             behaviour such as drug use or unwanted pregnancy). After some discussion it was agreed to
             drop the health impact, as some experts questioned whether there was a causal link between
             a 2-3 month apprenticeship training complemented by a one-week life skills training and
             improved health outcomes.
        •    The discussion of the budget showed that data collection is the most expensive budget item
             (two thirds of the total evaluation budget of USD 300,000 will be spent on running the
             surveys). As the program is run on a national basis, the cost of tracking and visiting
             participants is relatively high (USD 75 per person interviewed). Are there measures to
             reduce data collection costs?




C. Conclusions and follow-up
After participating in the Clinic, the four shortlisted projects are invited to submit full Evaluation Plans to
YEN by 28 February 2011. A template has been provided as guidance on what is expected in the
Evaluation Plan. The shortlisted evaluations are also requested to submit drafts of questionnaires, ToRs,
communication material and baseline, mid-term and final evaluation reports as they become available.
By submitting these documents, the evaluation teams agree that they can be shared publicly.


Once the Plan is submitted, it will be commented on by the Evaluation Taskforce and a decision will be
taken on how much funding to provide. In total, $310,000 in funding is sought across the four impact
evaluations, while only $180,000 is available.

In addition to financial support, technical support will continue to be provided by the Evaluation
Taskforce. The support will include comments on the evaluation plans, the possibility of a site visit and
advice at key stages of evaluation implementation, such as the selection of a data collection firm,
questionnaire design and the selection process.

At the suggestion of the Evaluation Taskforce, the Streetkids International Youth Venture Initiative will
submit a rigorous monitoring plan to be considered for funding instead of an impact evaluation plan.
The prior development of a rigorous monitoring plan will improve the likelihood of a successful impact
evaluation. A quality monitoring plan should include a thorough results chain, a system for collecting
and analyzing data and appropriate measures for monitoring and assessing indicators. In the future, YEN
will look to provide support through the Evaluation Clinic and Fund for the development of its partners'
monitoring frameworks.

The second call for proposals for the Fund for Evaluation in Youth Employment will be issued in the
second quarter of 2011. The thematic and regional concentration of the next call is yet to be
determined.




  Annex 1: Participants’ Evaluation

  1- Distribution of participants by professional background

   Project staff of a live case study discussed during the clinic          8
   From an NGO                                                            14
   From a government agency                                                4
   From a multilateral organization                                       10
   From a private consulting firm                                          1
   Resource person                                                         5
   NA                                                                      1
   Total                                                                  43


  2- Clinic effectiveness

                                                     Poor       Fair    Good    Excellent     NA     Total

     1. Achievement of stated objectives              0%        9%       37%      49%         5%     100%

     2. Quality of planning results                   0%        9%       37%      44%         9%     100%

     3. Design and organization of the Clinic         0%        7%       40%      51%         2%     100%

     4. Methodology applied                           0%        12%      35%      51%         2%     100%

     5. Your overall rating of the Clinic                                                            100%



  3- Resource People Effectiveness

                                                         Poor          Fair    Good     Excellent   NA       Total

1. State of preparation                                    0%          5%      40%          49%     7%   100%

2. Resource people knowledge of topics                     0%          5%      23%          65%     7%   100%



3. Resource people ability in presenting lectures       0%          7%        47%       42%       5%       100%

4. Interaction within the group                         2%          5%        40%       49%       5%       100%

5. Your overall rating of the resource people           0%          5%        47%       44%       5%       100%



  4- Clinic Administration

                                            Poor     Fair    Good        Excellent   NA       Total

  1. Preparation of the Clinic                  0%   7%       28%          63%       2%       100%

  2. Suitability of the training venue          2%   5%       14%          77%       2%       100%

  3. Suitability of the time schedule           2%   5%       23%          65%       5%       100%

  4. Quality of catering                        2%   0%       26%          70%       2%       100%



  5- Overall comment on the Clinic

      •   Excellent learning opportunity for a NGO like mine that is in the middle of evaluating our work
          and looking to take the next steps. Also good networking with other organizations.
      •   Great to have some live cases and apply method of impact evaluation. But would also have
          been useful to spend a bit more time on the other 3 projects.
      •   The clinic was well organized, although transport for non-boarders could have been provided
          to and from the hotels in Nairobi where they were staying.
      •   I think it should have taken a shorter time period like 1 1/2 days. Impact could have been
          achieved in this time. Cases used should have been disbursed to participants before clinic in
          order to understand and be more critical.
      •   Very educative- the delivery is simplified and accommodative to different levels of education,
          and experiences.
      •   The time allocated for entire clinic was not enough. Especially for those starting/unfamiliar
          with topics. Is it possible to add an hour, like start an hour earlier?
      •   I liked the variation and the field visit. The dinner was a good networking opportunity.
      •   A bit too much methodological input that was too basic; more discussion of how to implement
          designs and the obstacles faced when implementing would have been great.
      •   Very constructive undertaking. May be useful even to expand/extend to allow for more rigour.
      •   The clinic was excellent. Could improve on: End time too late for Nairobi residents. Please plan
          to end no later than 4.30 pm each day; The site visit was misplaced or not explained.


      •   Personally very useful in better understanding Impact Evaluations with programmes about to
          start, less so for post-programmes (still useful however). Would have liked maybe a discussion
          on questionnaires and formulating relevant questions.
      •   Good and added value to my career as someone who oversees evaluations. I enjoyed the
          speed geeking and the network that I have built along the way.
      •   The clinic was well organized although the time was limited especially for those with no or
          little knowledge on IE
      •   The clinic was an overall success as it gave me a good foundation in Randomization which I did
          not previously have.
      •   The clinic was informative and challenged the conventional way of assessing/measuring
          results in youth employment programmes. Practical & easy to apply, though more inputs on
          the same are required for practitioners.
      •   Great learning! Resource people & others have been willing to share their knowledge without
          limitations. It's a learning experience and I am very grateful you invited me.
      •   Excellent format, but perhaps a bit high cost, would be great to find a way to scale back on
          cost so these could be run more frequently and in more places. Drew and Susana did a
          marvellous job organizing the event.
      •   I thought the combination of technical presentations & working groups was a good balance.
          Some of the technical presenters, however, were slightly hard to follow in the beginning.
      •   Nice and varied delivery of complex information, nicely interactive. Really good job in
          recruiting quality participants.
      •   It was well planned, but the final output of the evaluation plans for the selected projects was
          not very good, especially the outcome section.
      •   Learnt a lot. Wished there was a little more time for a basic workshop on outputs,
          outcomes and their respective indicators. Very useful, very educative, very participatory.


6- Would you recommend this clinic to a colleague or peer policy maker or practitioner working on
youth employment and entrepreneurship issues?

Yes        No      Maybe       Total

98%        0%        2%        100%



7- Would your organization be willing to host or partner with YEN in the implementation of an
evaluation clinic in 2011?

Yes        No      Maybe        N/A     Total




70%       20%        5%         5%      100%



8- What suggestions do you have for improving the clinic program?

      •   Would like more time to present Educate!'s M&E strategies in order to get advice from
          experts and other participants. Maybe this time could be taken from live consultations
          because some groups finished early.
      •   Keep up the spirit
      •   Field visit could have been more fruitful and related to the clinic such as visiting an
          entrepreneurship project rather than a vocational training centre
      •   More live cases less theory
      •   Sharing of information especially success stories, sharing of evaluation materials
      •   Include more live case studies and allow more lengthy presentations by resource persons.
      •   Have "energiser" activities in between presentations
      •   Making different groups according to knowledge level for some parts; live case studies should
          have (more) time to consult with each other
      •   More case studies should be included, especially for participants to discuss; better furniture (chairs)
      •   Have an implementer present for each live case; be more ... On whether you'll fund the
          Evaluations in the live cases.
      •   More youth involvement and participation/especially ones who can't access funding but are
          doing something to improve their status.
      •   Time for the break-away live consultations was in some cases limited
      •   Where possible, go through proposal and guide applicants on how to improve them.
      •   More solid background/introduction on the theory of change & randomization before
          beginning the case study.
      •   ... Of the methodology to be discussed and accompanied by live cases.
      •   Although it would have been hard to fit in time-wise, one or two short "case study" practical
          exercises (i.e. not live) might have been helpful. The one that was done was excellent; it would
          have been useful to have a similar one to practice, for example, identification strategy and/or
          budgeting.
      •   Try to incorporate youth into the M&E, as this is a big empowering forum. Try to sponsor
          people from outside the country where the clinic is held. THANK YOU SO MUCH!
      •   Would have been better if we could have interviewed the finalists prior to making final
          decision to invite them to ensure appropriateness of project for a "rigorous" impact eval.
      •   Extend IE Methods section. Offer short sample size component
      •   A better planned site visit. The first half was good; however, the wandering around SOS
          Technical College was not so good - although it was a great opportunity to see something
          besides the hotel.
•   Some sessions a little dry and repetitive, though that may have been necessary for
    comprehension. Very little sector specific info. I don't know much about youth
    entrepreneurship and would’ve liked an overview.
•   Afternoon working groups = low energy, possibly some re-arrangement here but otherwise it
    was great! Thanks for all your support & hard work!




Annex 2: List of Participants, Evaluation Clinic Nairobi, 22 - 24 November, 2010

Prefix   First Name    Last Name       Job Title                   Organization                         Country         E-mail address
Mr.      Cecil         Agutu           Programmes Manager          Informal Sector Business Institute   Kenya           agutuc@yahoo.co.uk
                                                                   - Strathmore Educational Trust
Miss     Josephine     Atieno Otieno   Monitoring and Evaluation   KEPSA                                Kenya           kyep-monitoring@kepsa.or.ke
                                       Officer
Mr.      Munga         Boaz            Assistant Analyst           KIPPRA                               Kenya           bmunga@kippra.or.ke
Mr       Paul          Bundick         Director                    FIELD Support Leader with            Kenya           pbundick@aed.org
                                                                   Associates
Ms.      Veronica      Chau            Associate Partner           Dalberg Associates                   United States   Veronica.Chau@dalberg.com
Ms.      Vera          Chiodi          Post-doctoral Fellow        J-PAL Europe                         France          vchiodi@yahoo.fr
Ms.      Miriam        Christensen     Associate, Communications   Youth Entrepreneurship Facility,     Tanzania        christensenm@ilo.org
                                                                   ILO
Ms.      Margaret      Dechada         Professor                   Kenyatta University                  Kenya           director-coep@ku.ac.ke
Mr.      Jens          Dyring          Chief Technical Adviser     Youth Entrepreneurship Facility,     Tanzania        dyring@ilo.org
                       Christensen                                 ILO
Mr.      Nathan        Fiala           Evaluation Expert           World Bank                           United States   nvfiala@gmail.com
Mr.      Drew          Gardiner        Technical Officer           Youth Employment Network             Switzerland     gardiner@ilo.org
Mr.      Julius        Gichohi         Program manager             Entwise Associates                   Kenya           gichohijg@gmail.com
                                       BDS/Reseach and
                                       Evaluation
Mrs.     Mugethi       Gitau           Business Challenge          Youth Initiatives - Kenya (YIKE)     Kenya           mugethi@yike.org
                                       Coordinator
Mr.      Eric          Glustrom        Director                    Educate!                             Uganda          eric@experienceeducate.org
Mr.      Malumbo       Gondwe          M&E Officer                 National AIDS Commission             Malawi          gondwem@aidsmalawi.org.mw
Ms.      Kamilla       Gumede          Executive Director          J-PAL Africa                         South Africa    kamilla.gumede@gmail.com
Mrs.     Linda         Helgesson       Evaluation Specialist       Femina                               Tanzania        lindahelgesson@hotmail.com
Mr.      Sylvan        Herskowitz      Field Coordinator           World Bank                           Malawi          sherskowitz@gmail.com
Ms.      Astrid        Hurley          Associate Social Affairs    UN Programme on Youth                United States   hurleya@un.org
                                       Officer
Mr.      Ibrahim       Hussein         Deputy Chief of Party       G-Youth Program EDC                  Kenya           ihussein@edc.org
Ms.      Caroline      Riungu          M&E Manager                 G-Youth Program EDC                  Kenya           criungu@edc.org
Mrs.     Linda         Ihutha          Director                    Catapult Enterprises Ltd             Kenya           lihuthia@catapult.or.ke
Mr.      Oliver        Kirimi          Chief Executive Officer     The Village Trust                    Kenya           ceo@villagefunds.org
Ms.      Constanze     Lullies         Program Officer             Jacobs Foundation                    Switzerland     constanze.lullies@jacobsfoundation.org
                                       Intervention and
                                       Application

Mr     Eliam         Mahohoma   National Project             International Labour                 Zimbabwe        mahohoma@ilo.org
                                Coordinator                  Organization, YES - JUMP
Mrs.   Jane          Maigua     National Project             International Labour Organization    Kenya           maigua@ilo.org
                                Coordinator
Mr.    Martin        Matiro     Trainer - Entrepreneurship   Informal Sector Business Institute   Kenya           mmatiro@eitkenya.org
                                                             - Strathmore Educational Trust

Ms.    Minna         Mattero    Regional Youth-to-Youth      Youth Employment Network /           Tanzania        mattero@ilo.org
                                Fund Coordinator             Youth Entrepreneurship Facility
Mr.    Robert        Mawanda    Country Coordinator          Youth Entrepreneurship Facility,     Uganda          mawanda@ilo.org
                                                             ILO
Mr.    Julius        Mburugu    Principal Consultant         Entwise Associates                   Kenya           juliusmburugu@yahoo.com
Ms.    Lynnette      Micheni    Programme Assistant          KCDF                                 Kenya           lynnette.micheni@kcdf.or.ke
Mr.    Ehud          Mukuha     Manager                      Eastleigh Community Center           Kenya           pceaecc@africaonline.co.ke
Ms.    Mwongeli      Muthuku    Program Assistant            Youth Entrepreneurship Facility,     Kenya           mwongeli@ilo.org
                                                             ILO
Mr.    Julius        Mutio      Entrepreneurship Expert      Youth Entrepreneurship Facility,     Kenya           mutio@ilo.org
                                                             ILO
Mr     Tapera        Muzira     Chief Technical Adviser      International Labour Organization    Zimbabwe        muzira@ilo.org

Mr.    Paul          Ndungu     Project Manager              Technoserve Kenya                    Kenya           pndungu@technoserve.or.ke
Miss   Julie         Ndwiga     Consultant                   Progress Management                  Kenya           ndwigajulie@yahoo.com
                                                             Consultants
Ms.    Rose          Ngara-     Policy Analyst               KIPPRA                               Kenya           rmuraya@kippra.or.ke
                     Muraya
Mr.    Gerald        Ng'ong'a   Coordinator                  Rafiki social Development            Tanzania        rafikishy@yahoo.com
                                                             Organization
Mr.    Abdikadir     Nur        Director of operations       Intercontinental Charity             Kenya           ico.org.com@gmail.com
                                                             Organization
Dr.    Gorretty      Ofafa      Lecturer                     Kenyatta University                  Kenya           gaofafa@yahoo.com
Mrs.   Mary          Ogola      Senior Lecturer              Kenya Technical Teachers College     Kenya           mo_ganga@yahoo.com

Dr.    Bell          Okello     Gender, Agric and Rural      International Center for Research    Kenya           bokello@icrw.org
                                Development Specialist       on Women (ICRW)
Mr.    Christopher   Okidi      Programme Development        Macho Youth Leadership               Uganda          chrisokidi1986@gmail.com
                                Advisor                      Development Foundation
Mr.    Victor        Orozco     Coordinator – Africa         World Bank                           United States   vorozco@worldbank.org
                                HIV/AIDS Impact Evaluation
                                Program

Mr.    Markus         Pilgrim       Manager                      Youth Employment Network            Switzerland   pilgrim@ilo.org
Mr.    Bainomugisha   Pius          Executive Director           Kyasira Home of Hope, Uganda        Uganda        bainomugishapius@gmail.com
Ms.    Olga Susana    Puerto        Evaluation Specialist        Youth Employment Network /          Switzerland   puerto-gonzalez@ilo.org
                      Gonzalez                                   Youth Entrepreneurship Facility
Ms.    Sonia          Rasugu        Youth Programme              KCDF                                Kenya         sonia.rasugu@kcdf.or.ke
                                    Coordinator
Mr.    Imran          Rasul         Professor                    University College London           United        i.rasul@ucl.ac.uk
                                                                                                     Kingdom
Ms.    Virginia       Rose Losada   Junior Technical Officer -   International Labour Organization   Switzerland   g1ifp_seed@ilo.org
                                    Youth Entrepreneurship
Mr.    Tom            Siambi        Program Officer              African Centre for Women,           Kenya         tsiambi@acwict.org
                                                                 Information and Communications
                                                                 Technology
Dr.    Damary         Sikalieh      Associate Professor of       United States International         Kenya         sikalieh@yahoo.com
                                    Business and Management      University - Africa
Mr.    Dennis         Simiyu        Monitoring & Evaluation      TechnoServe, Kenya                  Kenya         dsimiyu@technoserve.or.ke
                                    Manager
Mr.    Munshi         Sulaiman      Coordinator, Research        BRAC                                Bangladesh    m.sulaiman@lse.ac.uk
Mrs.   Angelica       Towne         Program Director             Educate!                            Uganda        angelica@experienceeducate.org
Mrs.   Dinah          Tumuti        Professor                    Kenyatta University                 Kenya         director-coep@ku.ac.ke
Mrs.   Kristy         Vanderplas    Program Coordinator          Street Kids International           Switzerland   kristy@streetkids.org
Mrs.   Alice          Wadugu        Ag. Managing Director Poly   Mombasa Polytechnic University      Kenya         wadugu2000@yahoo.com
                      Ondigo        . Enterprises                College
Mr.    George         Waigi         Country Coordinator          Youth Entrepreneurship Facility,    Kenya         waigi@ilo.org
                                                                 ILO
Ms.    Stefanie       Weck          Intern                       Youth Employment Network            Switzerland   StefanieD.Weck@web.de
Mrs.   Katharina      Wespi                                      Swiss Academy for Development       Switzerland   wespi@SAD.ch

Ms.    Lisa           Whitley       Program Economist            USAID Kenya                         Kenya         lwhitley@usaid.gov




 ANNEX 3: Agenda
Day 1 – Monday 22 November
Time                Topic                                                                      Presenter
8:30 - 9:00         Registration / Check-in
9:00 – 09:30        Round of Introductions                                                     Drew Gardiner, YEN
(Plenary Session)      - Sociometric introduction: connect participants and topic
                       - Introduction to moderators
9:30 - 9:50         Opening and Welcoming Remarks                                              Markus Pilgrim, YEN
(Plenary Session)                                                                              Jens Dyring Christensen, YEF
                                                                                               Constanze Lullies, Jacobs Foundation
9:50 – 10:00         Presentation of programme                                                 Markus Pilgrim, YEN
10:00 – 10:30        Coffee Break
10:30 - 11:15        Why Impact Evaluation?                                                    Imran Rasul, UCL
(Plenary Session)        - Why evaluate?
                         - The impact of impact evaluations
                         - How to implement an impact evaluation? - introduction
11:15 – 12:00        Real world evaluations: Evaluating Youth Entrepreneurship Programmes      Susana Puerto, YEF/YEN, and Veronica Chau,
                         - Examples from developing countries                                  Dalberg
                         - Using evaluation evidence
12:00 - 13:30        Lunch
(Plenary Session)
13:30 – 15:00        Project Marketplace                                                       Drew Gardiner, YEN
(Plenary Session)    Live case studies introduce projects and evaluation plans. Participants
                     rotate amongst project presentations.
15:00 – 15:20        Refresher session on Results Chain                                        Markus Pilgrim, YEN
(Plenary Session)        - Monitoring and evaluation: different but complementary
                         - Building a results chain
15:20 – 15:30        Introduction to Live Consultations and Concept Note                       Markus Pilgrim, YEN
                     Introduce case studies to their moderators and requirements of group
                     work
15:30 – 16:00        Coffee Break
16:00 – 17:30        Parallel Live Consultations (Part 1): Results Chain and Indicators        Group moderators with evaluation taskforce
(Group Session)      Groups begin work on their concept notes: building the results chain
17:30 –              Debrief and End of Day 1                                                  Markus Pilgrim, YEN


Day 2 – Tuesday 23 November
Time                  Topic                                                                         Presenter
9:00 - 9:10           Introduction to Day 2                                                         Markus Pilgrim, YEN
(Plenary Session)
9:10 - 10:30          Measuring Impact                                                              Nathan Fiala, World Bank
(Plenary Session)          - Overview of different evaluation methods
10:30 – 11:00         Coffee Break
11:00 – 12:00         Fictional Case Study                                                          Vera Chiodi, J-PAL
                           - Group work on selection methods through an evaluation case
                                study
12:00 - 14:30         Lunch & Visit to SOS Children’s Village
14:30 - 17:30         Parallel Live Consultations (Part 2): Research Question, Indicators & Data    Group moderators with evaluation taskforce
(Group Session)       Groups focus on the research questions to be answered by the impact
Coffee Break          evaluation as well as the corresponding indicators and sources of data.
included              Initial discussions on identification strategy will take place.
17:00 –               Debrief and End of Day 2

Day 3 – Wednesday 24 November
Time                       Topic                                                                            Presenter
9:00 - 9:10                Introduction to Day 3                                                            Markus Pilgrim, YEN
(Plenary Session)
9:10 - 12:00               Parallel Live Consultations (Part 3): Selection of Valid Method                  Group moderators with evaluation
(Group Session)            Groups work on devising the identification strategy of the evaluation.          taskforce
Coffee Break included      Sampling, rollout, and other design and operational issues will be
                           discussed.
                           Group should finalize concept note presentations.
12:00 - 13:30              Lunch
13:30 – 15:30              Summary: What do evaluation plans look like?                                     Markus Pilgrim, YEN
(Plenary Session)              - Presentation of concept notes from the 5 live case studies.
15:30 – 16:00              Coffee Break
16:00 – 16:30              Next steps                                                                       Markus Pilgrim, YEN
                               - What are the next steps for the projects?
                               - How to access financial funding?
                               - What support to expect from taskforce?
16:30 -                    Clinic’s evaluation and closing                                                  Markus Pilgrim, YEN


Annex 4: Impact evaluation resources

    •   Wikipedia on impact evaluation

    •   Judy Baker 'Evaluating the Impact of Development Projects on Poverty: A Handbook for
        Practitioners'

    •   Howard White (2007) 'Evaluating Aid Impact', Research Paper No. 2007/75, UNU-WIDER
    •   Rebekka E. Grun, 'Monitoring and Evaluating Projects: A step-by-step Primer on Monitoring,
        Benchmarking, and Impact Evaluation'

    •   Michael Bamberger (2006) 'Conducting Quality Impact Evaluations Under Time and Budget
        Constraints', World Bank, Washington USA.

    •   Martin Ravallion (1999) 'The mystery of the vanishing benefits: Ms Speedy Analyst's
        introduction to evaluation', Policy Research Working Paper 2153, World Bank, Washington,
        USA



