Internal Oversight Service
Evaluation Section
IOS/EVS/PI/72
Original: English

Evaluation of UNESCO's Results-Based Management Training Programme

External evaluation team: Thierry Senechal and Lee Mizell
Final Report, 14 March 2007

The views and opinions expressed in this document are those of the authors and do not necessarily represent the views of UNESCO or of IOS. The designations employed and the presentation of material throughout this document do not imply the expression of any opinion whatsoever on the part of UNESCO concerning the legal status of any country, territory, city or area, or of its authorities, or concerning the delimitation of its frontiers or boundaries.

This report was prepared at the request of UNESCO's Internal Oversight Service (IOS) for its use in connection with its evaluation of the Results-Based Management Training Programme. The views expressed in this report are those of the consultants and do not reflect the official opinion of IOS.

INDEPENDENT EVALUATION SOLUTIONS
Route de St-Cergue 15, CH-1260 Nyon, Switzerland
Office: +41 22 360 8090 | Fax: +41 22 360 8092
Contact person: Thierry Senechal, Director
firstname.lastname@example.org
Tel.
+33 624 28 51 11
Legal status: Swiss limited company registered in the Canton of Vaud under federal number CH-550-1043885-0

TABLE OF CONTENTS

ABBREVIATIONS AND ACRONYMS
1 EXECUTIVE SUMMARY
1.1 CONTEXT AND PURPOSE OF THE EVALUATION
1.2 METHODOLOGY OUTLINE
1.3 MAJOR FINDINGS AND RECOMMENDATIONS
2 INTRODUCTION
2.1 BACKGROUND AND CONTEXT
2.2 ORGANIZATION OF THE REPORT
3 POLICY AND MANAGEMENT
3.1 OVERVIEW
3.2 PROGRAMME DEVELOPMENT
3.3 WHO WAS TRAINED
3.4 PROGRAMME CONTENT
3.5 PARTICIPANT SELECTION
3.6 TIMING OF THE TRAINING
3.7 RESOURCE MOBILIZATION
4 QUALITY AND EFFECTIVENESS
4.1 MONITORING AND EVALUATION
4.2 THE PARTICIPANT EXPERIENCE
5 RBM ACTIVITIES OF OTHER AGENCIES
5.1 BACKGROUND
5.2 RBM AT THE UNITED NATIONS
5.3 IMPLEMENTING RBM IN DIFFERENT ORGANIZATIONS: LESSONS LEARNED
5.4 CONCLUSION: SUMMARY OF KEY FINDINGS
6 IMPACT OF THE TRAINING PROGRAMME
6.1 EVIDENCE FROM OFFICIAL DOCUMENTS
6.2 EVIDENCE FROM THE INDEVAL PARTICIPANT SURVEY
6.3 EVIDENCE FROM KEY-INFORMANT INTERVIEWS
7 MAJOR FINDINGS AND RECOMMENDATIONS
ANNEXES
8 TERMS OF REFERENCE
8.1 BACKGROUND INFORMATION
8.2 PURPOSE OF THE EVALUATION
8.3 PROCEDURES AND METHODS
8.4 THE EXTERNAL EVALUATOR / CONSULTANT
8.5 EVALUATION BUDGET
8.6 TIMEFRAME
8.7 DELIVERABLES OF THE EVALUATION
9 METHODOLOGY
9.1 INTRODUCTION
9.2 METHODOLOGY
9.3 TIMETABLE
9.4 LIMITATIONS OF THE ANALYSIS
10 LIST OF PERSONS INTERVIEWED
11 GUIDING QUESTIONS FOR SEMI-STRUCTURED INTERVIEWS
12 INDEVAL RBM TRAINING PARTICIPANT SURVEY
12.1 EMAIL INVITING PARTICIPATION IN THE SURVEY
12.2 EMAIL REMINDER FOR PARTICIPANT SURVEY
12.3 SURVEY INSTRUMENT AND SUMMARY RESULTS
13 RBM TRAINING SCHEDULE 2003-2005
14 GENERAL BIBLIOGRAPHY
15 BIBLIOGRAPHY FOR THE RBM SECTION
15.1 INTERNAL REPORTS AND DOCUMENTS
15.2 INTERNET LINKS CONSULTED
16 END NOTES FOR THE RBM SECTION

ABBREVIATIONS AND ACRONYMS

BSP Bureau of Strategic Planning
IOC Intergovernmental Oceanographic Commission
IOS Internal Oversight Services (UNESCO Paris)
JIU Joint Inspection Unit
OECD Organisation for Economic Co-operation and Development
RBM Results-based management
SISTER System of Information on Strategies, Tasks and the Evaluation of Results
UNDAF United Nations Development Assistance Framework
UNDP United Nations Development Programme
UNESCO United Nations Educational, Scientific and Cultural Organization
UNFPA United Nations Population Fund

1 EXECUTIVE SUMMARY

1.1 Context and purpose of the evaluation

There is a trend among public sector institutions towards results-based management (RBM). At the end of the last decade, UNESCO introduced RBM tools with the aim of improving programme and management effectiveness, strengthening accountability, and achieving results. As part of an overall reform process, UNESCO launched a comprehensive training initiative to orient staff to results-based management and its application to programme planning.

This evaluation assesses the RBM training programme developed and implemented by the Bureau of Strategic Planning (BSP) from June 2003 to November 2005. The overarching goal of this evaluation is to assess the training with respect to management, quality, and impact. Toward this end, the evaluation has four objectives:

1. Objective 1: Assess the policy and management structure associated with the RBM training programme
2. Objective 2: Assess the quality and effectiveness of the RBM training programme
3. Objective 3: Assess the impact of the RBM training programme
4. Objective 4: Provide options for UNESCO to further develop and improve the RBM training programme

1.2 Methodology outline

This evaluation employs four methodological tools: 1) an extensive desk review of UNESCO documentation and RBM training material; 2) approximately 25 interviews with the RBM training team, associated consultants, senior management, staff, and delegations at the Paris headquarters; 3) a web-based survey of participants in the RBM training programme; and 4) an analysis of programme evaluation data provided by BSP (see Annex 2 for a complete description of research methods).

This methodology has limitations. First, the evaluation focuses only on RBM training developed and implemented by the Bureau of Strategic Planning (BSP) from June 2003 to November 2005; no review of former or current RBM training activities was conducted. Second, it relies on a purely observational design. Without a "control group" of some sort (e.g. a matched comparison group), it is not possible to assert that participation in the RBM training caused any particular outcomes. Moreover, it is difficult to separate out potentially confounding factors that could explain positive or negative outcomes in the area of RBM utilization. Lastly, the time frame for conducting this evaluation was very short, most of the work having been conducted from mid-May to mid-June 2006.

1.3 Major findings and recommendations

1.3.1 Advantages

Based on our review and analysis, the key findings can be summarized as follows:

- RBM culture has been enhanced. UNESCO has been moving towards results-based management (RBM) by introducing policy and programme changes to improve the quality and increase the impact of its programmes.
The introduction of the RBM training has facilitated this move, as evidenced by the use of RBM concepts in work plans and other work activities, as well as the improved formulation of results in the C/5 noted by the Executive Board.

- Participants are generally satisfied with the training programme. Trainees express general satisfaction with the RBM training and show initiative in applying RBM concepts. Generally, participants feel more knowledgeable about the formulation of their inputs into the planning process.

- Specialized assistance on RBM is valued. Staff members appear to value the opportunity to learn about results-based management. At the same time, they seek personalized assistance and opportunities to gain hands-on experience with RBM concepts as they apply to work plans and other planning documents. This is provided and valued in the formal RBM training sessions. The RBM training complements other programming services already provided by BSP.

- The mix of formal training and coaching helps bridge theory and practice. Results-based management concepts are not always easy to understand. For many, RBM is a new way of thinking about programme planning, implementation, and assessment of progress. Offering training that provides a solid overview of RBM theory, as well as opportunities for hands-on work – either through group work or through personalized coaching – bridges theory and practice.

1.3.2 Challenges

While the overall assessment is largely positive, the training programme does face important challenges. They are grouped into two categories: 1) programme management and 2) training implementation and content. The recommendations are intended to be relatively low-cost and easy to implement. Important but lesser-priority recommendations are indicated by (*).

Programme Management

- Coherent and predictable operation of the RBM training programme is critical for long-term success.
Despite hard work and enthusiasm, important aspects of managing the programme fell short, hampering its effectiveness. In particular, the creation and/or approval of key programme planning documents, such as an approved work plan and budget, was delayed, and formal mechanisms for supporting the RBM training team were somewhat weak.

⇒ Recommendation: (1) Develop, maintain, and monitor a comprehensive work plan for the RBM training programme, complete with performance indicators and a communications strategy for informing field offices and headquarters staff about the availability of RBM training.

- Limited human and financial resources may weaken the long-term sustainability of the RBM training programme. Although an in-depth assessment of resource availability and utilization was not possible, the budget for the training programme appears to have been somewhat limited and the staffing pattern somewhat light. Looking forward, what is needed is stability, coherence, and predictability of resources.

⇒ Recommendations: (1) A Financial Sustainability Plan could be prepared for the next budget cycle. This document would assess the financing challenges facing the training programme and would describe the unit's approach to mobilizing and using resources to support programme objectives. (2) A "Donor Reference Group," comprised of some delegations, could be explored; it would allow donor governments to exchange views on RBM culture at UNESCO and to make financial commitments.

Training content and implementation

- Participants' skills have improved, but they need to be further enhanced to increase the correct use of RBM concepts. Despite the generally positive assessment offered by participants, it is clear that additional skill-building is needed to help staff formulate results for work plans and develop performance indicators.
Both interviews and survey data highlighted the importance of follow-up to refine skills, as well as the willingness of staff to participate in ongoing training. Staff seek opportunities to practice applying RBM concepts to specific work tasks. Many felt the training programme lacked practical exercises.

⇒ Recommendations: (1) Continue to offer the RBM training programme, but ensure that learning needs are clearly assessed and materials are targeted to those needs prior to embarking on another round of trainings. More than one set of training materials could exist (e.g. introductory, intermediate, and advanced). (2) Review the training materials with an eye to practicality. Update materials with new examples and practical exercises (e.g. sector-specific examples, applications of "good practice", mistakes to avoid, etc.). (3) Provide formal, structured, targeted training (e.g. coaching) to staff involved with formulating results at the MLA level, drawing on both the RBM training staff and well-trained BSP sector liaisons as coaches. (4) Offer the three-day formal RBM training sessions at headquarters, which has only received coaching sessions and partial trainings to date. (5) Provide self-learning tools for staff, perhaps by developing a comprehensive intranet site that uses the existing PowerPoint materials to build an "e-learning" course, complemented by other practical materials such as templates, worksheets, and reference materials. Manage and update the site regularly. If resources permit, the e-materials could also be turned into a CD-ROM for distribution to field offices with bandwidth constraints or other Internet access challenges. (6) * Incorporate RBM training documentation (perhaps from the e-learning materials described above) into the online orientation materials currently provided to new staff.

- For RBM to be applied effectively, training must take "intra-UNESCO" differences into account.
The data suggest not only that the training sessions need a more practical ("how-to") orientation, but also that specific attention should be paid to the different approaches that different sectors must take to formulating expected results.

⇒ Recommendations: (1) Collaborate with sector staff and BSP sector liaisons to tailor the training materials to sector-specific circumstances by modifying examples, exercises, and handouts depending on the audience and their needs.

- Quality control and monitoring of programme outcomes could be enhanced. The RBM team did make efforts to monitor important aspects of quality, particularly participant satisfaction. Post-participation evaluations were useful in this regard, but could have been improved to enhance the ability of BSP to monitor changes in learning and behaviors.

⇒ Recommendations: (1) Collect comprehensive and reliable data on programme participants (e.g. name, sector, professional grade, etc.) in order to know the characteristics of those who participate. (2) Use a pre/post survey design with a carefully constructed instrument to assess both participant satisfaction and changes in knowledge. (3) Collect data from all RBM trainees, including those in coaching sessions, to provide BSP with useful information about the different categories of assistance being offered. (4) Link data collection with performance indicators associated with a well-developed work plan to enhance programme monitoring and enable more strategic mid-course adjustments if necessary. (5) * Enhance quality control by participating in (or building) expert networks, particularly within the UN system.

1.3.3 Opportunities

- Looking forward, there is an opportunity to enhance the mainstreaming of RBM throughout organizational work processes. When asked if they use RBM concepts in work processes other than developing work plans, 74% of survey respondents said they did.
They offered a comprehensive list of applications, ranging from meeting participation to proposal writing to staff performance assessment. In addition, interviews reveal support for a more ambitious RBM culture at UNESCO, particularly to improve the reporting of results and the monitoring of programme activities.

⇒ Recommendations: (1) Develop a conceptual framework for RBM at UNESCO, to be applied throughout the organization, that promotes common understanding and concepts and that clearly extends the vision for RBM beyond the formulation of results for the C/5. BSP is expected to release guiding principles for RBM at UNESCO, which could make a valuable contribution in this regard. (2) * Recognize and reinforce the use of RBM concepts throughout the organization by incorporating "profiles of success" in training materials, staff newsletters, and on an RBM intranet web site.

- There are opportunities to build ownership of RBM. There is some evidence from the INDEVAL participant survey that ownership of RBM concepts is growing at UNESCO. However, without a clear mechanism for monitoring, staff have little reason to reflect on their work plans regularly and may not see work plans as a useful tool. Staff need to feel ownership of their work plans and their expected results in order to see the utility of the RBM training and to use it.

⇒ Recommendation: (1) Extend the use of RBM to monitoring and evaluation. "Monitoring for results" training might be considered as an intermediate RBM course. If resources are not immediately available for providing such training, BSP might consider offering workbooks, templates, or other instructional materials to help staff further develop their ability to identify and use performance indicators.
2 INTRODUCTION

2.1 Background and context

Public sector management has changed notably over the last forty years, with emphasis shifting from budgets (what is spent) to activities (what is done) to results (what is achieved).1 This shift has been observed not only among governments, but among international organizations as well. Various management practices have emerged to improve accountability, transparency, and performance. Results-based management (RBM) evolved in this context. As a management strategy, "[i]t's [sic] primary purpose is to improve efficiency and effectiveness through organisational learning, and secondly to fulfil accountability obligations through performance reporting."2

By the late 1990s, many United Nations organizations had turned to results-based management as a tool for improving performance, UNESCO among them.3 In 1999/2000, UNESCO initiated a reform process aimed at "rethinking UNESCO's priorities and refocusing its action, streamlining its structures and management procedures, revitalizing its staff and rationalizing its decentralization policy."4 It is no surprise, then, that results-based programming, management, and monitoring was introduced as a component of the reform process.

Although RBM was formally introduced at UNESCO through the 1999/2000 reform initiatives, interest in results-based management already existed among some staff. A group of approximately 80 individuals met regularly to discuss and recommend ways to introduce RBM to the organization. The group was managed by a senior officer in BSP responsible for the development of SISTER and for knowledge management in general. Informally called "About Results," the group put forward ideas which contributed to the development of the System of Information on Strategies, Tasks and the Evaluation of Results (SISTER), a management and reporting tool designed to facilitate the introduction of RBM.
The Director-General referred to SISTER in his speech at the closing ceremony of the 30th session of UNESCO's General Conference in November 1999:

I welcome the fact that the beginning of my term of office should coincide with the introduction of the System of Information on Strategies, Tasks and Evaluation of Results, SISTER. Its very title speaks for itself inasfar as it aims to set results, define and then implement strategies and tasks to reach them, and constantly evaluate the state of progress so as to be able to adjust the strategies to the identified outcome. What the title does not say … is that it also calls for a new mindset: all staff at their various levels will need to contribute together to drawing up the results at the relevant levels, thus requiring permanent dialogue between officers-in-charge. This will bring a keener sense of responsibility-sharing and greater collective efficiency … It should instill a new working spirit within the Secretariat, combining responsibility, accountability, trust, communication and a better understanding of common objectives.5

In response to the introduction of this new management tool, a variety of approaches were taken to develop RBM capacity among staff at UNESCO. Principal among these efforts was the engagement of external consultants to train staff in the use of log-frame analysis and to develop an RBM orientation manual specifically for UNESCO, both to build capacity and to accompany SISTER.

1 Meier, Werner. "Results-Based Management: Towards a Common Understanding Among Development Cooperation Agencies." Discussion Paper, v5. Ottawa: Results-Based Management Group, 2003, 3-4.
2 Meier, "Results-Based Management," 6.
3 Ortiz, Even Fontaine, et al. "Overview of the Series of Reports on Managing for Results in the United Nations System." Geneva: Joint Inspection Unit, 2004. JIU/REP/2004/5, 3.
4 "Report by the Director-General on the Reform Process." Paris: UNESCO, 2003. 32 C/32, 1.
The goal of these efforts was to enhance the results orientation of the 32 C/5 biennial programme and budget. Despite some progress in this regard, by April 2003 there was a clear desire to enhance efforts to mainstream RBM throughout the organization, particularly for the development of the work plans required to implement the 32 C/5. Toward this end, the Executive Board encouraged the creation of a dedicated training programme at its 166th session, a decision subsequently endorsed by the General Conference at its 32nd session:

The Executive Board … Notes with satisfaction the considerable progress achieved so far in the programming, presentation and results orientation of document 32 C/5 … which represent a major advance in the introduction of results-based programming, management and monitoring (RBM). … Acknowledges that the introduction of RBM at Headquarters and in the field is an ongoing process, which needs to be extended, in particular to the work plans implementing document 32 C/5, and to that end invites the Director-General to include specific training in the staff training programme, to which Member States are also invited to contribute extrabudgetary resources.6

The Bureau of Strategic Planning (BSP) responded to this mandate by developing and implementing a multi-faceted RBM training programme between June 2003 and November 2005. This programme, the subject of this evaluation, is described in detail in the following chapter.

2.2 Organization of the report

The remainder of this study provides a comprehensive assessment of the RBM training programme. Chapter 3 provides a detailed description of the design and implementation of the training programme, addressing questions related to programme policy and management. Chapter 4 addresses issues of programme quality and effectiveness; specifically, it describes the RBM training material, the programme monitoring system, and the experiences of participants.
Chapter 5 reviews the RBM activities of other agencies in the United Nations system. Chapter 6 looks specifically at the impact of the training programme and asks: Has the programme made a difference? Is UNESCO better off today than it would have been without the programme? The conclusions derived from each of the preceding sections are presented in Chapter 7, which offers both conclusions and recommendations. Finally, readers are encouraged to review the comprehensive set of materials provided in a series of annexes. These materials offer important insights regarding research methods and findings.

5 "Speech of the Director-General at the Closing Ceremony of the 30th Session of UNESCO's General Conference." Paris: UNESCO, 17 November 1999.
6 "Decisions Adopted by the Executive Board at its 166th Session." Paris: UNESCO, 2003. 166 EX/Decisions, 19-20.

3 POLICY AND MANAGEMENT

3.1 Overview

The RBM training programme in the Bureau of Strategic Planning (BSP) was launched in June 2003. The creation of the training programme was consistent with the new priority placed on staff training at UNESCO. While relatively few opportunities existed in the past, today staff training is seen as "central to both individual career development and the interests of the organization."7

The RBM training programme was created to respond to a variety of organizational needs. First and foremost, the goal of the training programme was to produce "significant improvements in the results components of work plans for document 32 C/5, future monitoring and reporting documents, the C/3 and C/5 documents."8 It was also intended to create "a 'results culture' within UNESCO, bringing about a constant improvement in the results formulation and reporting."9 Accomplishing these results would require not only skill-building, but also developing a new mindset oriented toward identifying, monitoring, and incorporating information regarding results.

3.2 Programme development

Between June and September 2003, energy was dedicated to developing the training programme.
Activities included an assessment of needs, a review of relevant literature, a review of materials from other RBM training programmes at international organizations, the incorporation of lessons learned regarding RBM training from other agencies, and the creation of teaching materials for UNESCO. Unfortunately, while the training programme was intended to support the internal reform process, aside from the specifications of the C/5 there appears to be no clear conceptual framework for implementing RBM at UNESCO. Such a conceptual framework, recommended by the Joint Inspection Unit for the implementation of RBM programmes throughout the United Nations, could have been useful in guiding the development of the BSP RBM training programme in its early stages.10

By the autumn of 2003, BSP had developed a pilot training programme to be launched in the field. The first test of this programme came in September at a three-day seminar (15-19 September) organized by the Bangkok Regional Office. While the purpose of the regional meeting was not RBM training, the BSP team was able to conduct a one-day workshop as part of the larger seminar schedule. This first training was unique because it was conducted as an introduction to RBM and was offered as one presentation at a regional meeting dedicated to other topics. Moreover, the group was much larger than for typical workshops (more than 70 people) and the time provided for RBM orientation was very short.

7 The training budget has increased progressively from $400,000. It will continue to rise until it corresponds to 3% of UNESCO's staff costs. Source: "Report by the Director-General on the Reform Process." Paris: UNESCO, 2003. 32 C/32, 5.
8 "Report by the Director-General," 5.
9 "Report by the Director-General," 5.
10 Ortiz et al., "Overview," 6.
While the Bangkok training proved to be more of a "sensitization session" than a proper training, it was an opportunity to test the programme, gather feedback from participants, and gauge interest and abilities regarding results-based management. Lessons learned from this experience were used to adjust the programme and materials for the subsequent three-day workshop offered to Directors of UNESCO Cluster Offices in West Africa, held in Dakar.

In November 2003, the RBM team invited an external consultant to attend the 3-day training session held in Quito, Ecuador, to assess content and teaching methods and to ensure that the programme met recognized standards of quality and effectiveness.11 The consultant from the Canadian consultancy firm Le Groupe-conseil baastel ltée spent four days in Quito, observed the sessions, their delivery, the materials presented, and the use of a post-participation survey, and discussed the experience with participants. Overall, the findings were highly positive.

The training programme continued to be revised and adjusted through March 2004. By March, the team had developed a relatively stable 3-day training programme to offer at both field offices and headquarters. The initial phase not only saw the completion of the training materials, but also the addition of a third full-time member of the RBM team to provide administrative and logistical support for training activities. The following sections describe the audience, content, staffing, and budget of the training programme in detail.
11 The Canadian consultancy firm Le Groupe-conseil baastel ltée was selected by BSP in consultation with the Government of Canada.

3.3 Who was trained

The target audience for the RBM training programme was envisaged to be division and field office directors, professional staff, general staff, and national staff working on the programming, implementation, monitoring, and reporting of the Programme.

How many training workshops were conducted? Between June 2003 and November 2005, the Bureau of Strategic Planning conducted 29 RBM training sessions for UNESCO staff, both at field offices and at headquarters.12 In all, a total of 493 individuals registered for the trainings, 371 from field offices and 122 from headquarters.13 This is an impressive achievement for a training staff of essentially two individuals.

All participants from headquarters attended either a partial training (35) or a coaching session (87). By contrast, both in-depth 3-day and 1-day training sessions were held in the field. In all, 267 field staff attended 3-day sessions, and 104 attended one-day sessions.

Figure 1: RBM Training Participants by Year, 2003-2005 (bar chart showing counts of full training, partial training, and coaching participants in 2003, 2004, and 2005). Source: UNESCO BSP. Note: Includes 31 individuals who attended more than one training and a small number of individuals who were registered for, but did not attend, a partial training or coaching session.

The data available regarding who participated in the RBM training programme are rather thin. The location, date, duration, and names of registered participants are available for each training session, but information regarding the sector, position, or grade of attendees is not. Having these data would have allowed BSP to determine whether the training programme was targeting the right "audience" at the right time, given the distribution of responsibilities throughout the C/5 planning cycle.
12 An additional training for 15 staff of the Palestinian Authority was held in Ramallah in 2004.
13 These figures include 31 people who attended more than one training session, as well as some individuals who were registered for but did not attend the training. They exclude the 5 participants from UNESCO national commissions and UNRWA, as well as the 15 participants in Ramallah.

Table 1: Overview of RBM Training Offered by Year and Location, 2003-2005

Location | Month | UNESCO staff | Level | Length | Location type | Training | Trainers/facilitators

2003
Bangkok (1) | Sept. | 72 | Regional | 1 day | Field | Partial training | B. Lefevre, T. Fiorilli
Bangkok (2) | Sept. | 13 | Office | 1 day | Field | Partial training | B. Lefevre, T. Fiorilli
Dakar (1) | Sept. | 7 | Sub-regional | 3 days | Field | Full training material | B. Lefevre, T. Fiorilli
Dakar (2) | Sept. | 19 | Office | 1 day | Field | Partial training | B. Lefevre, T. Fiorilli
Quito (1) | Nov. | 12 | Regional | 3 days | Field | Full training material | B. Lefevre, T. Fiorilli
Quito (2) | Nov. | 19 | Office | 3 days | Field | Full training material | B. Lefevre, T. Fiorilli

2004
Addis Ababa | Jan. | 7 | Sub-regional | 3 days | Field | Full training material | B. Lefevre, T. Fiorilli
Nairobi | Jan. | 12 | Office | 3 days | Field | Full training material | B. Lefevre, T. Fiorilli
Cairo | Mar. | 17 | Office | 3 days | Field | Full training material | B. Lefevre, T. Fiorilli
IOC (1) | Mar. | 13 | Division | 1 day | Headquarters | Partial training | B. Lefevre, T. Fiorilli
Kingston | Apr. | 14 | Office | 3 days | Field | Full training material | B. Lefevre, T. Fiorilli
Amman | May | 15 | Office | 3 days | Field | Full training material | B. Lefevre, T. Fiorilli
Beirut | June | 18 | Office | 3 days | Field | Full training material | B. Lefevre, T. Fiorilli
Windhoek | Jul. | 16 | Office | 3 days | Field | Full training material | B. Lefevre, T. Fiorilli
IOC (2) | Nov. | 22 | Division | 2 days | Headquarters | Coaching | B. Lefevre, T. Fiorilli
Science Sector | Nov. | 43 | Sector | 1 day | Headquarters | Coaching | B. Lefevre, T. Fiorilli + external consultant

2005
Teacher Training | Jan. | 9 | Division | 1 day | Headquarters | Coaching | B. Lefevre, M.V. Garcia Benavides, J. Dorn
Bureau of Budget | May | 6 | Unit | 1 day | Headquarters | Partial training | B. Lefevre, T. Fiorilli, J. Dorn
Jakarta (1) | July | 13 | Office | 3 days | Field | Full training material | B. Lefevre, J. Dorn
Jakarta (2) | July | 12 | Office | 3 days | Field | Full training material | B. Lefevre, J. Dorn
Dakar | Aug. | 15 | Office | 3 days | Field | Full training material | B. Lefevre, T. Fiorilli
Bangkok (1) | Sept. | 13 | Office | 3 days | Field | Full training material | B. Lefevre, T. Fiorilli
Bangkok (2) | Sept. | 19 | Office | 3 days | Field | Full training material | B. Lefevre, T. Fiorilli
Doha | Sept. | 8 | Office | 3 days | Field | Full training material | B. Lefevre, T. Fiorilli
Montevideo | Nov. | 21 | Office | 3 days | Field | Full training material | B. Lefevre, T. Fiorilli, M.V. García Benavides
Beijing | Nov. | 19 | Office | 3 days | Field | Full training material | B. Lefevre, T. Fiorilli, A. Keller
Bonn | Nov. | 10 | Center | 3 days | Field | Full training material | B. Lefevre, A. Keller
CLT Coaching | | 13 | Division | 1 day | Headquarters | Coaching | B. Lefevre, M.V. Garcia Benavides, J. Dorn
Primary ED | | 16 | Unit | 0.5 days | Headquarters | Partial training | B. Lefevre, T. Fiorilli, J. Dorn

3.4 Programme content

Three categories of training were offered to UNESCO staff as part of the RBM training programme: a 3-day intensive training course, 1-day introductions and partial trainings, and personal coaching sessions. Follow-up assistance was provided to participants by email and telephone as needed.

3.4.1 The 3-day training course

The core element of the training programme was a 3-day workshop consisting of seven presentations/modules (in 2004) or nine presentations/modules (in 2005) on results-based management. The content of the RBM training appears to have been the subject of some debate within BSP. In particular, finding the right balance between theory and practice seems to have been a struggle. Early training sessions tended to provide more RBM theory than later sessions.
As a result, the number of modules and the content of the 3-day training sessions changed somewhat between 2004 and 2005. An overview of the modules is provided in Table 2.

Table 2: RBM Training Modules, 2004 and 2005

Module | Windhoek 2004 | Jakarta 2005
1 | Objectives, agenda and organisation of the workshop - Introduction | Objectives, agenda and organisation of the workshop - Introduction
2 | What is RBM? | What is RBM?
3 | Why RBM? | Basic concepts of RBM: Performance Framework
4 | Performance measurement framework | Results chain
5 | How do you select activities? | Results chain in UNESCO
6 | Managing, monitoring, reporting and evaluation | Performance Framework: Designing an intervention
7 | Focusing on results | Managing, monitoring, reporting and evaluation
8 | Check list | Performance Framework
9 | — | Performance measurement framework
— | — | Special session: Ensuring sustainability of the results of the workshop: next steps

Source: “RBM training programme: Focusing on results.” Training programme agendas from the Windhoek UNESCO Office (July 6-8, 2004) and the Jakarta UNESCO Office (July 19-21, 2005)

Training sessions were generally conducted for groups of 20 individuals or fewer, in order to provide participants with personalized attention. When more individuals needed to be trained, a second training was often offered (e.g. Bangkok and Jakarta, 2005).

A total of 24 hours was dedicated to RBM skill-building over the course of the 3-day sessions, of which nearly 13 hours were dedicated to working group sessions in which participants applied RBM concepts to work plans with the supervision and assistance of trainers. Work plans were developed using materials that participants were asked to prepare in advance and bring to the training. In most cases the hands-on work was done on the basis of existing work plans. In 2004, participants were asked to complete three exercises prior to the beginning of the training sessions.
The instructions provided to the participants indicate that, in addition to theory, the training aimed to build UNESCO-specific RBM skills that would have a direct and positive impact on planning documents. Specifically, they were asked to:14

1) “Kindly prepare at your earliest convenience FOUR activities of your choice for the Office that you would like to work on and improve during the workshop. These could be, for example, representative of the various Sectors. These should be activities that you will update in SISTER at the end of the workshop… We will analyse the activities sent to us before arrival and take these samples as one of the starting point [sic] for the workshop as far as its design and formulation component is concerned. …

2) Please find attached the template sent by BSP/SISTER on October 11th, 2004. You may wish to review the work plans you are responsible for and to assess if they are complete and of quality. Please try to identify the potential weaknesses of the work plans and the key problems you encountered while preparing them. You do not need to answer these questions formally but please take some notes to participate in a discussion on this point at the beginning of the workshop.

3) Finally, Please [sic] prepare a brief informal intervention answering the following question: ‘How have you selected the 32 C/5 activities you are responsible for?’ … The purpose of this exercise is to understand what steps could be taken to prepare your Office Strategic Plan. We will share with you the lessons learnt so far.”

In 2005, the participants were asked to complete only one task prior to the start of the training: to summarize four activities that could be used to prepare corresponding work plans for 2006. The degree of emphasis on practical skills varied substantially between field offices and headquarters.
Field offices tended to receive the complete 3-day training programme, which offered both an introduction to RBM concepts and hands-on applications, while training at headquarters focused almost exclusively on practical skill-building through coaching.

14 UNESCO Bureau of Strategic Planning. Invitation letter to participants for the Windhoek training workshop, June 24, 2004.

At each session, participants received a packet of materials (“workshop file”) that included:

- A workshop agenda;
- A list of participants;
- A copy of the pre-workshop letter sent to participants; and
- A copy of all of the PowerPoint slides (which constituted the majority of the materials provided).

Prior to 2005, when changes were made to the training materials, participants also received the following elements as part of their workshop file:

- An extract from Executive Board decisions regarding RBM;
- A memo from the Director-General regarding preparation of the 32 C/5 (July 2003);
- A memo from BSP regarding training in RBM (August 2003);
- A note/technical guide from BSP on the preparation of work plans (August 2003);
- A guide to keying the work plan into SISTER; and
- A guide to key elements for formulating SISTER work plans and tips for incorporating RBM.

After June 2005, in addition to elements one through four, the packet contained:

- A UNESCO RBM glossary; and
- An RBM working document for preparation of 33 C/5 activities.

3.4.2 Partial Training and Coaching

In addition to the full 3-day training course, the BSP/RBM team offered partial trainings (introductions) and coaching sessions for the preparation of work plans. While 3-day courses tended to be offered in field offices, partial training and coaching sessions were more likely to be offered at headquarters. The latter approach differed substantially from the formal training course. In particular, there was no standard format or material for the 1-day and half-day trainings or for coaching sessions.
The exact content and structure of these workshops were jointly prepared by BSP/RBM and the sector, division, or unit receiving the training. Individual and small-group coaching occurred as part of the preparation for the C/5. Participants worked with the BSP/RBM trainers over a period of one or two days to develop and refine their materials for the C/5. Sectors then provided BSP with these materials for consideration. Unfortunately, feedback regarding this process was not wholly positive. While individuals felt that they had improved their work plans for the C/5, they often received second-round comments from BSP that contradicted the instructions provided by the RBM trainers. There was a perception of a disconnect within BSP regarding the definitions and applications of key RBM concepts.

3.5 Participant selection

Between June 2003 and November 2005, RBM trainings were offered “on-demand” to offices and sectors that requested assistance. Although training was “demand-driven,” there appears to have been little in the way of a formal communications strategy for alerting staff to the availability of trainings. Moreover, there appears to have been little strategic targeting of individuals, offices, or sectors. This is somewhat surprising given the specific short-term goal of the training programme: “significant improvements in the results component of work plans for document 32 C/5.”15 The training programme was offered first in field offices and then at headquarters. While this is consistent with the decentralization process and may have had the effect of creating a long-lasting, bottom-up RBM culture, in the short term it was potentially less likely to produce results for the 32 C/5. An alternative approach would have been to provide intensive training within BSP, then at HQ, and then in field offices.
The early emphasis on field offices also required that a substantial portion of the budget be dedicated to travel, money that might otherwise have been spent on creating a more sustainable training infrastructure. This assessment is not meant to suggest that field office training was unimportant. In fact, RBM appears to have been enthusiastically received in field offices. Rather, early emphasis, particularly in the pilot phase, could have been placed at HQ. Once offices were selected for training, individual participants were invited or nominated to participate. In some cases all field office staff were invited to attend the trainings. In other cases, invitations were extended to key staff (e.g. those using SISTER) or staff were nominated by the Executive Office or the office director. Early on, emphasis was placed on organizing regional or sub-regional workshops for directors; the focus then transitioned to training office staff involved with programming and implementation of programmes. Later, there appears to have been a desire to invite and train “focal points,” individuals who would return to their offices and share their knowledge with their peers. The challenge with the latter approach is that people 1) may not be sufficiently trained after one session to train others, 2) may not have the time to do so, or 3) may not be perceived as having the authority or mandate to change work processes. Given the importance of mainstreaming RBM at UNESCO, a “train the trainer” approach is only likely to work effectively if “trainers” receive intensive instruction and ongoing support from both BSP and their office director, and if their efforts are supported by a comprehensive, web-based, internal RBM training site that they and their colleagues can turn to for additional support. Offering more than one level of RBM training (i.e.
introductory, intermediate, and advanced) might be one approach to identifying and training RBM “enthusiasts” who could in turn train peers if a “train the trainer” approach were to be pursued.

3.6 Timing of the training

As discussed, the timing of the classroom-based trainings was demand-driven and opportunistic. This approach certainly has its drawbacks in terms of providing a predictable and fixed schedule of trainings. However, its advantage is that staff tend to be more motivated when trainers arrive if a request for assistance has been made or a need for training has been identified by the office director. Yet there is an opportunity to develop a communications strategy that would “spread the word” about the availability of the training programme, generate demand, and permit BSP to develop a flexible but comprehensive training schedule.

15 “Report by the Director-General,” 5.

Comments from the INDEVAL survey highlighted the importance of not scheduling a training session during periods in which many individuals are likely to be on vacation, and of providing sufficient advance notice to participants. Developing and circulating a training schedule could help in this regard. Overall, however, trying to time the RBM trainings with the planning cycle is somewhat difficult. Level 5 and level 6 results can be revised on an ongoing basis. As such, training can and should be offered year-round to enhance RBM skills. However, results at the MLA level are not subject to the same type of ongoing revision. These must be established at a particular time with a particular group of managerial staff. As such, there may be an opportunity to provide targeted, well-structured assistance (e.g. coaching) on results formulation at that level at specific points in time. The challenge is finding the right coaches to do the training, because programmatic and sector know-how (versus general RBM knowledge) is important.
One approach might be to train the BSP sector liaisons to participate formally in the coaching sessions. This may require “advanced level” training, as well as a formal structure for the coaching element of the RBM training programme. Providing specialized assistance to this managerial group should not preclude including the same group in ongoing, classroom-based RBM trainings.

3.7 Resource Mobilization

3.7.1 Human resources and partnerships

BSP set out to develop the RBM training programme by establishing a special team within BSP that would be responsible for designing and implementing the programme. The initial team consisted of two individuals: a seasoned UNESCO staff member who played the lead role in designing and managing the training programme and a junior professional assigned to BSP. These individuals acted as the primary RBM trainers; an additional staff member and an intern were added in 2004 to assist with the coordination and implementation of the training programme. The choice of an “all-UNESCO team” was a conscious one, based on previous experience with external consultants. While those experiences had been generally positive, it was determined that external consultants knew too little about organizational dynamics to teach RBM concepts effectively in a UNESCO context. Moreover, with a mandate to train staff both at headquarters and in the field, it seemed prudent to recruit a team with related experience. Thus, the team leader for the training programme, recruited in April 2003, had worked at Headquarters and had served as Office Director in two field offices. While each member brought important skills to the team, none had pre-existing expertise in results-based management.
Between June 2003 and November 2005, the members of the RBM team dedicated nearly 100% of their time to developing, coordinating, and implementing the training programme.16 Overall, little formal support seems to have been extended to the RBM team in a manner that demonstrates an interest in the long-term sustainability of the RBM training programme. For example, detailed job descriptions, professional training opportunities, and performance appraisals related to the training programme appear to have been lacking. These management tools are particularly important in light of the fact that the team was not assembled from a pool of RBM experts. When staff members do not bring strong expertise in their field of competence, regular training, supervision, and evaluation are generally expected. While each member brought a unique strength to the team, without expertise or formal training in RBM, the task of developing and implementing the training programme for hundreds of UNESCO staff was undoubtedly made more challenging. Providing formal management support in the form of clearly defined job descriptions, opportunities for ongoing professional development in RBM, and performance appraisals linked to their ongoing work would have been helpful. The good news is that, despite the absence of these supports, the team was able to implement a comprehensive programme that was positively reviewed by the vast majority of participants. For the most part, partnerships with other organizations do not appear to have been systematically developed, posing risks to the long-term sustainability of the RBM training programme. The training team visited various departments and agencies of the United Nations to understand how an RBM culture is implemented in various parts of the system. However, these partnerships were forged on an ad hoc basis, primarily to exchange information.
Stronger partnerships with RBM experts outside UNESCO, including but not limited to staff at other UN agencies, could help to ensure that 1) the training materials in use at UNESCO remain up-to-date, 2) feedback regarding concepts, teaching methods, and materials is provided on a continuous basis, 3) UNESCO is able to share its approach and success in a community of experts, and 4) evaluation is conducted regularly by RBM experts who come to know the UNESCO system (a concern discussed previously). While UNESCO has specific RBM training needs and the training programme must be tailored to those needs, partnerships can enhance BSP’s ability to benchmark its approach against the best practices currently in use internationally. One exception regarding partnerships was the collaboration with the Canadian firm Baastel. At the encouragement of the Canadian government, BSP retained Baastel to assist with the early stages of programme development. In addition, Baastel was asked to assist with a training/coaching session provided to the Intergovernmental Oceanographic Commission (IOC). In cooperation with the RBM team, they offered a one-day training for fifteen IOC staff as well as nine days of coaching to IOC staff for the preparation of their contribution to the C/5. Baastel also provided five days of coaching to Science Sector staff (other than IOC) for the preparation of their RBM-based contribution to the C/5.

16 Interviews revealed that the RBM team may have been somewhat isolated from the rest of BSP. With limited staff and all of their time dedicated to the training programme, it was difficult to participate in other BSP tasks.

3.7.2 Financial resources

Assessment of the RBM training budget is difficult, particularly since comprehensive budget and expenditure documents specifically related to the RBM training in 2003, 2004, and 2005 were not available for review. As a result, it is impossible to judge how much it costs to achieve programme goals.
It is also difficult to determine how much funding is available now and in the future relative to what is required for programme expansion and improvement. This is particularly true as RBM training for 2006 and 2007 has been combined with other forms of BSP training, both in budgetary terms and in practical application. What is clear is that resources have been somewhat limited. The 2003 training budget appears to have been $100,000. For the following year, a $290,000 budget was formulated as follows:17

- Training workshops and ad hoc training in various field locations (travel for field staff and trainers; organization of workshops; preparation of materials; backstopping): an average cost of US$ 25,000 per workshop18 times an estimated 8-10 workshops, for a total of US$ 200,000
- Temporary assistance for managing the programme and secretarial support: US$ 60,000
- Initial development of a website and CD-ROM modules for e-training and self-learning: US$ 30,000

Resources for the RBM training were generated in a number of ways at UNESCO/BSP: through contributions from the unit’s budget, from the corporate training budget, and from indirect contributions from member states (e.g. Canada contributed by providing the financial support for Baastel; Italy contributed by providing funding for the junior professional post, thus financing one staff person). In fact, only one member of the training team was paid for directly by the RBM budget. That essentially three staff members were able to develop materials and conduct approximately 30 trainings for over 400 staff worldwide on a limited budget is noteworthy. However, in the absence of more comprehensive budget information, determining the cost-effectiveness of resource utilization remains elusive.

17 UNESCO Bureau of Strategic Planning. Requests by BSP for 2004 Training Funded From Corporate HRM Funds, Submission to the Learning and Development Commission,
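The 2004 budget arithmetic described above can be reconstructed as follows (illustrative only; the figures come from the report, and the workshop count of 8 is the low end of the “8-10 workshops” estimate used in the budget submission):

```python
# Illustrative reconstruction of the 2004 RBM training budget described above.
# All figures are taken from the report text.

cost_per_workshop = 25_000          # average cost per field workshop (US$)
workshops = 8                        # low end of the 8-10 workshop estimate
workshop_total = cost_per_workshop * workshops   # US$ 200,000

temporary_assistance = 60_000       # programme management and secretarial support
e_training_development = 30_000     # website and CD-ROM modules

budget_2004 = workshop_total + temporary_assistance + e_training_development
print(budget_2004)  # 290000
```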
October 24, 2003.

18 Documentation provided by BSP indicates that the exact cost of the 2003 trainings in Bangkok and Quito was $57,596.25 ($3,867.00 - equipment; $4,239.00 - supernumerary contract; $14,147.32 - Bangkok 15-19 Sept 2003 Regional RBM Training Workshop; $15,498.43 - Quito 17-19 Nov 2003 RBM Regional Training Workshop). Source: Report: Implementation of Allocated Training Funds 2003, Excel spreadsheet.

4 QUALITY AND EFFECTIVENESS

4.1 Monitoring and evaluation

The RBM training team relied on three key approaches to quality assurance for the training programme:

1. Using existing materials as the basis for developing a UNESCO training programme;
2. Meeting with colleagues in other international organizations and inviting the feedback of an external expert; and
3. Distributing a post-participation survey to participants.

First, the training programme was developed using materials in use by other UN and international agencies. This helped to ensure a general compatibility of materials with other organizations. Such materials included:19

- RBM: An Introductory Guide to the Concepts and Principles, Canadian International Development Agency (CIDA), 1999
- RBM Handbook on Developing Results Chains, CIDA, 2000
- Implementing Results-Based Management: Lessons from the Literature, CIDA, 2000
- Results-Based Management and Accountability for Enhanced Aid Effectiveness, CIDA, 2002
- Logic Model Development Guide, W.K. Kellogg Foundation, 2001
- Glossary of Key Terms in Evaluation and Results Based Management, OECD/DAC, 2002
- Results Based Management Concepts and Methodology, UNDP, 2002
- A Guide for Developing a Logical Framework, University of Wolverhampton, 2000

Second, members of the training team met with colleagues in other international organizations to discuss RBM implementation and training in various contexts.
As mentioned previously, these meetings appear to have been somewhat ad hoc, and formalizing these arrangements through partnerships could add to the quality of the training programme. BSP also invited Le Groupe-conseil baastel ltée (Baastel) to attend an early training session to ensure that the programme met recognized standards of quality and effectiveness. Finally, the primary technique for evaluating the RBM training on an ongoing basis was the use of post-participation evaluation surveys at the close of each training session. While post-participation surveys were circulated after the pilot trainings in Bangkok and in Dakar, it was not until the Quito training sessions that the survey form “stabilized”. Participants were asked to rate various aspects of the training sessions on a scale ranging from 1 (lowest) to 5 (highest). Specifically, they were asked to assess:

1. The content, clarity, and usefulness of each training module (“presentation”);
2. The effectiveness of working groups in the learning process;
3. The organization of the programme in terms of time spent on theory, hands-on work, discussions, participants, and reporting;
4. The material distributed;
5. The venue and logistics;
6. The trainers’ delivery of the theoretical part of the workshop;
7. The effectiveness of group work support/coaching;
8. The availability of trainers;
9. The impact the workshop would make on participants’ work in the future; and
10. The training programme overall.

19 UNESCO Bureau of Strategic Planning. Results Based Management Training Workshop – Windhoek, 6-8 July 2004. Workshop File.

Participants were also invited to offer “remarks and recommendations.” Overall, data were collected from the overwhelming majority of programme participants. In some cases, however, respondents provided only partial data. These tended to be instances in which training sessions were shorter than three days.
As a result, participants were not exposed to (and therefore could not report on) all aspects of the training. Similarly, evaluation data varied as modules were added to or deleted from the training. The good news is that approximately 300 post-participation surveys were collected that provided useful feedback on the RBM training. The RBM training team was able to use the data to:

1. Monitor participant satisfaction;
2. Reflect on and adjust the content of the training sessions; and
3. Incorporate statistics and highlight key comments in summary reports following training sessions.

While the evaluation data proved useful in many ways, the overall approach to monitoring the training programme suffered from a handful of weaknesses. First, the post-participation survey collected important information regarding participant satisfaction but did not gather information regarding skills or knowledge acquired. Rather than a post-only design, a pre/post-test approach could enable BSP to assess whether or not participants gained skills and knowledge as a result of programme participation and, if so, in which domains. Second, the survey questions focused on important dimensions of participant satisfaction (course content, venue, availability of the trainers, etc.), but the questions were loosely or poorly worded, making interpretation somewhat subjective. For example, it is not clear what aspect of “content” participants are asked to assess. Similarly, multiple dimensions of the training programme are embedded in a single question. Specifically, participants are asked to rate the overall organization of the programme in terms of “sharing of time between theory, hands-on work, discussions, participants reporting.” Finally, the rating scale itself is difficult to use. Without references to what a “1” means or a “5” means, the scale may mean different things to different people.
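The pre/post-test design suggested above need not be elaborate. The sketch below illustrates the basic computation of a mean learning gain; the scores are invented purely for illustration, since no pre-test data were collected for the UNESCO programme:

```python
# Hypothetical sketch of a pre/post-test analysis for a training course.
# The scores below are invented for illustration only; the RBM programme
# collected no pre-test data, so no real figures of this kind exist.

def average_gain(pre_scores, post_scores):
    """Mean raw gain across participants (post minus pre on the same test)."""
    gains = [post - pre for pre, post in zip(pre_scores, post_scores)]
    return sum(gains) / len(gains)

# Invented example: ten participants scored out of 20 on an RBM knowledge test
pre = [8, 10, 7, 12, 9, 11, 6, 10, 9, 8]
post = [14, 15, 12, 16, 13, 15, 11, 14, 14, 12]

print(round(average_gain(pre, post), 1))  # 4.6
```

Broken down by module or domain, the same comparison would show not just whether participants gained knowledge, but in which areas.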
Providing a definition for each rating category, or a narrative rating scale, could help to standardize the categories. In short, careful attention to questionnaire design could enhance the usefulness of the evaluation data. Third, the evaluation data did allow the RBM training team to assess progress in terms of participant satisfaction, but the data were not clearly linked to expected results for the training programme. As a result, they do not necessarily provide the correct or complete information for monitoring and evaluation. The evaluation data could be more useful if the survey were designed in a way that corresponds to performance indicators associated with a formal work plan, which also seemed to be lacking. Finally, while evaluation data exist for the formal training programme (the full 3-day training or introductory sessions using the same material), similar data do not exist for “coaching sessions.” As a result, a substantial number of individuals who received assistance from the RBM team did not have an opportunity to comment on their experience. Moreover, neither the RBM team nor BSP had any way to judge whether or not coaching (as opposed to formal training) resulted in different levels of satisfaction or skill-building.

4.2 The participant experience

While the BSP/RBM approach to monitoring and quality control was limited, the data are useful for assessing participant satisfaction. They provide a particularly useful snapshot of participants’ immediate reactions to the training experience. In combination with the INDEVAL survey conducted specifically for this evaluation, the data enable UNESCO to develop a comprehensive picture of participants’ experiences.

Table 3: Post-participation assessment of the training overall and impact on future work

Session | Overall | Impact
2003
Bangkok* | 3.15 | –
Dakar* | 3.43 | –
Quito - Office | 4.54 | 4.50
Quito - Regional | 4.13 | 3.96
Average 2003 | 4.34 | 4.24
2004
Addis | 4.71 | 4.33
Amman | 4.42 | 4.50
Beirut | 4.15 | 3.82
Cairo | 4.19 | 4.00
IOC | 4.13 | 4.00
Kingston | 4.50 | 4.27
Nairobi | 4.11 | 3.75
Windhoek | 4.64 | 4.64
Average 2004 | 4.35 | 4.18
2005
Bangkok 1 | 3.91 | 4.18
Bangkok 2 | 3.79 | 4.21
Beijing | 4.12 | 4.00
Bonn | 4.75 | 4.60
Bureau of Budget | 3.50 | **
Dakar | 3.68 | 3.57
Doha | 3.63 | 3.25
Jakarta 1 | 3.92 | 4.35
Jakarta 2 | 4.44 | 4.70
Montevideo | 4.00 | 3.81
Primary Education | 4.04 | 4.07
Average 2005 | 3.99 | 4.04
Overall^ | 4.16 | 4.11

Source: RBM Training post-participation survey
* Bangkok and Dakar data are not particularly comparable to other survey data
** Too few data points to generate a reliable estimate
^ Excludes Bangkok and Dakar data

4.2.1 Are participants satisfied with the training experience?

Overall, it appears that participants were quite satisfied with the training experience. As part of the post-participation survey, they were asked to provide a global assessment of the training and to assess the “impact this workshop will make on your work in the future.” Both the global assessment and participants’ assessment of impact on future work are relatively high, with most assessments hovering near 4.0 on a scale ranging from 1.0 to 5.0. A few observations can be made. First, overall assessments improved in the first year as the training programme progressed from the initial trainings offered in Bangkok and Dakar to a stable programme offered in multiple field offices around the world. However, they declined somewhat in 2005. The reasons for this decline are unclear, but it might be attributable to the changes in course content instituted in 2005. Overall satisfaction was highest in Bonn (4.75) and lowest at the Bureau of the Budget (3.50); the latter was a one-day training with relatively few participants. Notably, comments from participants were overwhelmingly positive or constructive. A sample of comments includes:

…I am glad I had the chance to participate. I am sure it would help me in my future planning of activities. It was hard work but FUN.
An excellent intelligent investment on my experience on my logic training, which will definitively influence my further work positively. Thank you all.

…an excellent opportunity to think in a global and practical way [about] UNESCO work and its presence at national, cluster, regional levels with concrete links with HQ

Great to have this sort of workshop; Got us together to discuss our programme

While the tool (SISTER) has limitations due to connectivity, the whole idea of RBM has made it much clearer and understandable how UNESCO goes about implementing its mandate. The workshop has been very useful and will certainly make life much …easier...

This workshop is very useful for us especially for those who don't have much experience on project proposal drafting to have very useful and practical exercises on the project proposal formulation. Now we have a clearer picture about UNESCO mandate (goals) structures and operational mechanisms. This is of vital importance for our future work in UNESCO.

It's very helpful. Now I know what I put down in 33C/5 draft was not very appropriate as result or performance indicator. Yes, this will be useful for our future work…

The results of the RBM participant survey conducted by INDEVAL confirm high levels of satisfaction with the overall training programme. While participants do have useful, constructive criticisms of the workshops, overall they found the experience useful. Unfortunately, not only were participants in the "coaching sessions" not provided with an ex-post evaluation survey, they were also somewhat reluctant to respond to INDEVAL's participant survey. The low response rate among individuals who attended one-day partial training/coaching sessions makes it difficult to determine their level of satisfaction.

4.2.2 Do the courses respond to participants' needs?

One way to assess whether or not the courses responded to participants' needs is to compare participants' prior expectations with their ex post assessment.
Limited data are available on participants' expectations. However, used in combination with survey data, they make it possible to provide an indication of participants' satisfaction with what they learned, the content of the training, and the organization of the course. First, what did participants expect to get out of the RBM training course? What did they expect to learn? Comments from one training session indicate that common expectations include:20

- To improve existing RBM skills
- To contribute to sector and cluster work
- To learn to formulate, propose, and manage projects
- To learn to monitor and evaluate programmes effectively
- To clarify the concept of results and separate it from other concepts
- To learn about RBM, how it works, and how to apply it to work processes
- To learn about work plans, how to prepare them, and how to make them results-oriented
- To become familiar with UNESCO terminology, the C/5 planning process, and how work plans fit together

Did the training programme address these and other learning needs? Data suggest that, in many ways, it did. First, an overwhelming 92% of respondents to the INDEVAL participant survey indicated that they "agree somewhat" or "agree strongly" with the statement "After the training, I felt that I had learned information and/or acquired skills that I could use for my work." In addition, comments from the post-participation survey suggest that participants felt the course provided useful information and that they gained valuable skills. While participants' understanding of RBM concepts seems to have increased, there is a substantial margin for improvement. On the one hand, the good news is that the INDEVAL participant survey found that approximately half of participants understand RBM concepts "very well." On the other hand, the other half of participants are not that confident.

20 "Expectations of the Windhoek RBM training workshop." Documentation provided by the Bureau of Strategic Planning.
In particular, 63% of those who reported that they understand RBM concepts "somewhat well" said that they found it "difficult to understand how to apply the concepts." In many cases (35%), respondents felt that it might be because the training period was too short, while 17% found the jargon difficult to understand, or the workshop exercises to be unhelpful. In fact, despite the 12+ hours spent in working groups, many participants felt there was a need to incorporate practical examples into the training programme. Comments from the post-participation evaluation surveys include:

This was a useful workshop. Next time let's have more practical examples.

… It could be more effective if we have more practical exercises so that we can learn by doing.

I think the workshop is very useful, especially on how to deal with UNESCO's work plan as well as work plan development of projects in general. Personally, it would be much easier for me if there are also practical examples…

More true-life examples of RBM successful implementation at the Office, Sector and programmes levels orally illustrated from a problem solving approach [would be useful]

Illustration with concrete examples from our work plan would have been useful.

It is the first time that I participate in this kind of activity. I have found it very useful and I hope I can use it in my work. I would recommend [to] implement more examples during the course.

… More concrete and relevant examples needed when explaining theories. I learned the most from the discussion. I think the session would be more useful if instead of using random concrete examples you used the actual work plan.

…Some good samples of work plans could have been shared by the trainers, which could have helped the trainees.
However, the template provided was very useful, especially w/regards to SISTER elements indicated in it…

These comments suggest that although the working group sessions provided an opportunity to apply RBM concepts, the modules needed to incorporate more "real life" examples so that, prior to breaking out into small groups, participants could absorb the theoretical aspects of results-based management a bit better. In particular, both interviews and the INDEVAL survey suggest that there is a pressing need to communicate how (and not why) UNESCO goals, which can be somewhat abstract with a long time horizon, should be clearly formulated as short-term expected results. Interviews revealed that working closely with sector staff to identify and tailor useful examples might help clarify this task for staff. However, despite their desire for more practical examples, participants gave high marks to the content, clarity, and usefulness of the modules themselves (see Table 4).21

Table 4: Post-participation assessment of the training presentations and working group

                      Presentations                      Working
                      Content   Clarity   Usefulness     Group
2003
Bangkok               Not available/comparable            –
Dakar                 Not available/comparable            –
Quito - Office         4.43      4.39      4.41           –
Quito - Regional       4.08      4.10      3.91           –
Average 2003           4.27      4.25      4.16           –
2004
Addis                  4.66      4.68      4.59           –
Amman                  4.18      4.29      4.36           –
Beirut                 4.08      4.00      4.06           –
Cairo                  4.18      4.13      4.35           –
IOC                    4.22      4.09      4.06           –
Kingston               4.41      4.47      4.69           –
Nairobi                4.12      4.00      3.91           –
Windhoek               4.60      4.47      4.72           –
Average 2004           4.30      4.26      4.36           –
2005
Bangkok 1              3.96      3.76      3.97          4.33
Bangkok 2              3.93      3.74      4.04          4.15
Beijing                4.33      4.23      4.29          4.29
Bonn                   4.48      4.29      4.66          4.50
Bureau of Budget       3.50      3.83      2.83           –
Dakar                  3.99      3.78      3.98          3.96
Doha                   3.83      3.74      3.28          3.63
Jakarta 1              3.92      3.87      4.08          3.80
Jakarta 2              4.39      4.17      4.56          4.50
Montevideo             4.13      4.29      4.26          3.88
Primary Education      3.93      4.30      4.17           –
Average 2005           4.07      4.03      4.10          4.13
Overall                4.17      4.13      4.20          4.13

Source: RBM Training post-participation survey
– Working group data not available

While the trainings did offer the opportunity to apply the RBM concepts during small group work sessions as early as 2003, it was not until 2005 that individuals were asked to comment on the "effectiveness of working groups in the learning process." Table 4 shows that, overall, participants found the working groups to be quite helpful. They provided relatively high scores to both the presentations and the working groups. Scores may have been higher had additional time been dedicated to working groups and had more (or different) exercises been offered. Table 4 also indicates that the assessments regarding the content, clarity, and usefulness of the presentations declined somewhat between 2004 and 2005. Again, it is important to point out that the training programme was modified in 2005, which may explain the decline in scores. In fact, the individual "content, clarity, and usefulness" scores for the presentations added in 2005 are just slightly lower than scores for the presentation material used in previous years. Here it is important to distinguish between observable declines and meaningful declines. The substantive difference between a score of 4.08 (Content, Beirut) and 3.83 (Content, Doha) is difficult to discern. Thus, what matters more is not the magnitude of the decline but the overall trend itself.

21 Recall that post-participation evaluation surveys do not exist for coaching sessions. Data refer to the classroom-based courses only.
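The size of these year-on-year movements can be checked directly against the yearly averages reported in Table 4. The following is a minimal sketch of that arithmetic (the scores are transcribed from the table; the report's own averages are presumably weighted by the number of respondents at each site, so the differences should be read as approximate):

```python
# Year-average presentation scores from Table 4, by dimension.
avg_2004 = {"Content": 4.30, "Clarity": 4.26, "Usefulness": 4.36}
avg_2005 = {"Content": 4.07, "Clarity": 4.03, "Usefulness": 4.10}

# Observed decline in each dimension, 2004 -> 2005.
decline = {k: round(avg_2004[k] - avg_2005[k], 2) for k in avg_2004}
print(decline)  # {'Content': 0.23, 'Clarity': 0.23, 'Usefulness': 0.26}
```

The decline is roughly uniform (about a quarter of a point) across all three dimensions, which is consistent with reading it as a broad trend following the 2005 course revision rather than as a problem with any single dimension.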
Participants were also asked to comment on the organization of the course, in terms of programming, materials, and venue (Table 5). These aspects of the training also received high marks from participants. Again, however, averages dropped slightly from 2004 to 2005.

Table 5: Post-participation assessment of programme organization

                      Program   Materials   Venue
2003
Bangkok               Not available/comparable
Dakar                 Not available/comparable
Quito - Office         4.50      4.54       4.46
Quito - Regional       4.38      4.15       4.92
Average 2003           4.44      4.35       4.68
2004
Addis                  4.43      4.57       4.14
Amman                  4.08      4.58       4.33
Beirut                 4.24      4.47       4.76
Cairo                  4.69      4.69       4.46
IOC                    3.86      4.00       4.14
Kingston               4.57      4.57       4.46
Nairobi                4.11      4.33       2.88
Windhoek               4.87      4.87       4.87
Average 2004           4.40      4.55       4.38
2005
Bangkok 1              3.82      4.09       3.82
Bangkok 2              3.93      4.29       4.50
Beijing                4.12      4.35       4.82
Bonn                   4.50      4.89       4.80
Bureau of Budget      Data not available
Dakar                  3.29      3.62       3.43
Doha                   3.88      3.88       3.63
Jakarta 1              3.90      4.10       4.30
Jakarta 2              4.38      4.38       4.38
Montevideo             3.87      4.24       4.12
Primary Education     Data not available
Average 2005           3.95      4.20       4.23
Overall                4.19      4.36       4.34

Source: RBM Training post-participation survey

With respect to programming, scores hovered near 4.0. As indicated previously, participants generally valued the training but recommended that more practical applications be incorporated.

With respect to materials, participants again gave the training high marks. Many participants found the materials well-organized and useful as reference material. Comments regarding the materials from the post-participation evaluations in 2004 frequently rate the materials as excellent, good, well-organised, and very useful. Specific comments include:

I will always refer to the workshop file as it will serve as my benchmark or reference manual.

An excellent tool that will be very useful in the future

Very useful, comprehensive and organised

Excellent job and will be used as reference manual on RBM

The file was more than useful. Perfect!

Few comments were critical of the materials.
They included:

Over-elaborated. Handbook could be designed in non-PowerPoint form.

I think that the file could be organised in a better way that focuses more on relating the different categories (items) of RBM so that it sounds more connected together as a whole system, i.e. it flows more naturally.

The content of the training changed somewhat in June 2005, and the materials made available to participants changed as well. Shortly thereafter, the post-participation evaluation score regarding the utility of training materials declined slightly. A comparison of the training materials provided at Windhoek (2004) and Bonn (2005) reveals a number of changes that may have contributed to this slight decline in user satisfaction. What appear to be efforts to streamline the content brought both gains and possible setbacks in terms of pedagogy. The gains were achieved largely through the elimination of superfluous slides, the simplification of slide content, and the introduction of a useful RBM glossary. However, in the attempt to streamline materials and make them more user-friendly, some important content was changed, diluted, arranged in a less useful fashion, or eliminated altogether. For example:

Changed: Notably, the definition of results-based management changed from 2004 to 2005. In fact, the 2005 definition of RBM (also used by UNFPA) differs from that presented in the 2004 training materials, the pilot materials used in Bangkok, and the handbook developed for UNESCO by RTC Consultants. It is important that UNESCO identify and commit to a definition of RBM that can be used consistently throughout the organization. A conceptual framework for RBM at UNESCO could be helpful in this regard.

Diluted: The 2004 training materials justify the RBM training using institutional references (e.g. Executive Board decisions), whereas in 2005 the answer to "Why RBM" has more to do with the use of results-based management at organizations worldwide than at UNESCO.
While these justifications need not be mutually exclusive, the former are more compelling from the perspective of organizational commitment to managing for results and should be made clear for participants.

Rearranged: In what appears to be an effort to emphasize concepts associated with results formulation, the presentation on the "results chain" was moved before the presentation on the planning framework. However, as results are part of the planning framework, it seems more logical to introduce the concepts of a results chain after presenting concepts such as the goals and purpose of an intervention.

Eliminated: The note/technical guide from BSP on the preparation of work plans (August 2003), the guide to keying the work plan into SISTER, and the tip sheet were missing from the Bonn workshop training materials provided for review. These documents are useful and practical guidelines that staff can use as reference guides for preparing their work plans, and should be provided as part of the participant "packet."

Perhaps what is most important, however, is what did not change. Despite the fact that participants repeatedly commented on the need for more practical examples in the post-participation surveys, specific examples of RBM concepts were not integrated into the PowerPoint materials. Frequent presentation of UNESCO-relevant examples is critical to help staff understand how to translate theory into practice. Building examples into the training materials that individuals take away from the workshop will ensure that they have useful, practical reference materials at their fingertips. It will also make it easier to develop a stand-alone e-course for staff to consult online.

Finally, returning to the post-participation survey results in Table 5, participants seemed satisfied with the venue and logistics for most of the trainings. However, the assessment was uniquely low for the training in Nairobi.
Comments from participants at this training suggested that there was inadequate notice to participants, that the training may have conflicted with work schedules, and that the in-office venue was distracting. A handful of comments from other sites also suggested that choosing a venue away from the office might prove less distracting for participants. The INDEVAL participant survey found that 85% of respondents agreed somewhat (40%) or agreed strongly (45%) that the date and time of the training were convenient.

4.2.3 Are course durations adequate to ensure acquisition of RBM skills?

According to the INDEVAL participant survey, approximately half of participants felt that the duration of the training programme was "just right." But interestingly, nearly 46% felt that the programme could have been longer. Findings were similar for the post-participation surveys. In many cases, participants felt that they would have benefited from more training time, more time for practical applications, or additional time in a follow-up session. Some participants felt that too much information was conveyed in too short a time period. Comments included the following:

The duration was rather short. Hands-on work suffered due to lack of time.

The time for the training was limited in terms of the practical and theory.

More time required for such a workshop… Follow-up session (refresh[er] course) required after 6 - 8 months.

Time available for working groups was too short. 3 days with more than 6hrs/day is too tight a schedule. I think time should be at least one week for this workshop.

Useful workshop, which could be longer than three days with more time to be allocated to hands-on training and group discussions with the trainers.

[The one day training was] too short to get the details.

… an information overload without sufficient time being provided to digest the information.

Time too short; Content: too much compared to time

Too much theory given at short timing.
More time is needed in order to absorb the contents of the training material.

… reduce the hours of training and increase the number of days of the workshop.

Some participants did feel that the training could have been shorter, but they tended to be in the minority. Currently, UNESCO is organizing two-day workshops. While this is shorter than before, the intention is to meet participants' need for hands-on assistance by providing small-group assistance and individual coaching with regard to work plans.

4.2.4 How effective is the training in conveying RBM concepts to participants?

About half of the respondents felt that they understood RBM concepts "very well," but 47% understood the concepts only "somewhat well" (only 2% felt they did not understand RBM at all). Of course, self-assessed ability is likely to differ from true ability, and individuals may have some incentive to over-report their knowledge of RBM. Interviews revealed some skepticism regarding improvement in work plans, for example. Overall, however, these figures are encouraging. In instances where respondents felt their knowledge was lacking, it was largely because they had not yet mastered how to apply RBM concepts. Certainly, RBM concepts are not mastered overnight, or even in one training session. For many people, RBM is a new way of thinking that takes time to develop. Others struggle to see RBM as useful rather than as a formality. Participants clearly indicated that more training, and practical training in particular, would be helpful. This finding does not necessarily mean that teaching RBM theory is inappropriate for UNESCO's needs. Nor does it mean that UNESCO must fund continual training sessions for all staff. Rather, it identifies the limits of a training programme on a complex topic, which must balance theory and practice.
Time spent refining the hands-on component of the three-day workshop, developing a formal structure for coaching, and working with sectors to tailor specific trainings to their needs could go a long way. In addition, translating existing material into an online training programme, further enhanced with worksheets, templates, glossaries, and instruction manuals, could provide a ready reference for trainees who need additional assistance or follow-up support. Not only would such practical measures facilitate organization-wide learning, but they would also enhance the sustained impact of the training programme.

5 RBM ACTIVITIES OF OTHER AGENCIES

5.1 Background

The pace of today's globalised world means that change is a constant, and this is no different for the United Nations (UN). The demands placed by Member States on the UN Secretariat, agencies, funds and programs have grown enormously. With a Secretariat of about 7,800 employees and a total staff of about 63,500 in the UN system (including the World Bank and the International Monetary Fund), the UN system is expected to deliver more services in more places than ever before to the world's people who are most in need. Needless to say, innovation is required to reform the UN system, and RBM is often perceived as critical to the reform process. This is why world leaders at the 2005 World Summit requested a number of landmark reports. In the second half of 2006, three further reports elaborated on this vision. One of them, the "Comprehensive Review of Governance and Oversight", contains far-reaching recommendations on key management processes and structures which, if approved by Member States, should redefine the way the Organization works. In particular, the report, delivered in July 2006, recommends a series of improvements that affect both management and the governing structures by reinforcing RBM.
It also notes that there is strong evidence that "properly implemented results-based management provides the basis for greater transparency, more effective budgetary decision-making, and therefore improved working practices between governing bodies and executive management. This report strongly advocates the continuance and strengthening of such practices in the future.i" Thus, RBM, in the wider context of the UN system, has become a key innovation needed for successfully reforming the organization.

5.2 RBM at the United Nations

By the late 1990s, many United Nations organizations had turned to results-based management as a tool for improving performance. Generally speaking, RBM is perceived as an innovative management approach focused on achieving results. Many organizations of the United Nations system have embarked on a process of introducing performance management systems based on RBM concepts. A chief impetus driving the adoption of RBM strategies has been the need to present results that are backed by measurable indicators and evidence of achievement. More importantly, member states have become more forceful in asking for feedback on their contributions and in ensuring that funds are well spent. In a time of crisis and reduced budgets, the UN has been constrained to embark on major reforms and has thus focused more on results. Not surprisingly, RBM is almost everywhere in the UN system today. For example, UNDP, the United Nations Population Fund (UNFPA) and the World Food Program (WFP) all use the term results-based management in most of their documents; the United Nations Children's Fund (UNICEF) uses results-based program planning and management; the United Nations uses results-based budgeting (RBB); and the United Nations Educational, Scientific and Cultural Organization (UNESCO) refers to its results approach as results-based programming, management and monitoring.
All of these agencies and organizations have also developed comprehensive programs for training their field officers and headquarters staff on RBM and its implications for strategic planning and budgeting. At the International Labor Organization (ILO), the results approach translates into strategic budgeting, and at the Food and Agriculture Organization (FAO), the approach is implemented through a set of conceptual and procedural advances (Strategic Framework, New Programme Model, enhanced monitoring and evaluation regime), while rarely referring explicitly to results-based management.ii Indeed, the objective of RBM is to "provide a coherent framework for strategic planning and management based on learning and accountability in a decentralised environment."iii RBM aims to improve management effectiveness and accountability by "defining realistic expected results, monitoring progress toward the achievement of expected results, integrating lessons learned into management decisions and reporting on performance".iv The definition provided by UNDP is as follows: RBM is a "management strategy or approach by which an organization ensures that its processes, products and services contribute to the achievement of clearly stated results. Results-based management provides a coherent framework for strategic planning and management by improving learning and accountability. It is also a broad management strategy aimed at achieving important changes in the way agencies operate, with improving performance and achieving results as the central orientation, by defining realistic expected results, monitoring progress toward the achievement of expected results, integrating lessons learned into management decisions and reporting on performance".v When this planning is done properly, progress can be measured and results assessed. All United Nations programs are now subject to results-based management, under which a program is accountable for delivering results to its stakeholders.
RBM is not a budgetary or financial function but is in essence a team-based and participatory approach to management designed to improve program and management effectiveness, efficiency and accountability at the organizational, regional and country levels. In the past, civil servants were concerned mainly with a management approach that focused on inputs and processes. Today, RBM requires them to focus on the measurable changes (results) to be achieved, on the strategies and activities that will lead to those changes, and on reporting on the results achieved and their contribution to reaching organizational goals. Overall, RBM also seeks to balance expected results with the resources available. In doing so, RBM becomes a tool for planning, monitoring progress regularly and adjusting activities as needed to ensure that the desired results are achieved. Results-based management, therefore, is an approach that integrates the management of strategies, resources, activities and information about performance, with a view to improving effectiveness and accountability, and achieving results. RBM can be applied in planning, monitoring, evaluating and reporting on any type of program. For it to succeed, a consensus exists that RBM has to be introduced at all levels of the organization. The RBM approach must thus be applied in all organizational units and programs at regional and local levels. Indeed, stakeholders are often more interested in knowing where and how the money is spent in the various country offices than in funding at headquarters. Still, all organizational units are expected to define results for their own work, which will also contribute towards achieving the overall results defined for the organization.
These units include:
- Program management
- Financial resource management
- Information management
- Human resources management
- Strategic management

5.3 Implementing RBM in different organizations: Lessons learned

5.3.1 The UNFPA experience

Background

UNFPA is committed to addressing issues of population and development, reproductive health, gender equality, and women's empowerment that will contribute to the ultimate goal of improving the quality of life and achieving sustainable development. One of the leading organizations within the UN system on RBM, the Fund is committed to strengthening both program and internal management so as to perform its functions effectively and efficiently to achieve results. In partnership with governments and civil society, UNFPA is seeking to further strengthen its field operations and global advocacy, and to deploy its resources effectively through a results-oriented approach. However, as in other UN organizations and agencies, the RBM framework could be a more effective tool for monitoring country-level program performance if project staff had a better understanding of the approach. UNFPA defines a result as a describable or measurable change in state that is derived from a cause and effect relationship.vi UNFPA systematically focuses on results to ensure that financial and human resources are strategically deployed to achieve the greatest impact. To do so, managers take the lead in ensuring that RBM guides all staff, bearing in mind the diversity of situations in which they work and the role played by the Fund's partners in achieving results. Training and experimentation have been adopted at all levels of the organization, and particularly at country office level. In RBM at UNFPA, the country office is accountable for achieving the results identified in the country program, in close collaboration with national partners.
Training

The Fund is undertaking various measures to "institutionalize RBM" in its programming and management operations, both at headquarters and in the field. Several of these measures focus on increasing the requisite technical capacities of UNFPA's professional staff in RBM. Included among such measures was the Results-Based Management Workshop for Country Support Team (CST) Members held in Montreal, Canada, in January 2001. The Country Support Team Advisers who participated in this workshop were expected, subsequently, to train their colleagues in the application of RBM to their work, and particularly in support of country programs. Training is also provided at country level in evaluation and monitoring as well as in RBM, and three-day regional workshops have been designed and delivered. For instance, it is in this context that CST Addis Ababa conducted an internal seminar/workshop for its professional staff in May 2001, entitled "CST Addis Ababa and RBM: the role of CSTAA in operationalising RBM at the country level". Some resource persons were drawn from the Ethiopia field offices of UNICEF and USAID, and from the Africa Youth Alliance project in Uganda. There were also participants from UNFPA Field Offices in the Central African Republic, Chad, Ethiopia and Uganda. All CSTAA Advisers were encouraged to make contributions/presentations during the seminar on how best to plan, monitor and evaluate program interventions at field level in order to achieve results more effectively, and to report more clearly on results; special emphasis was placed on the requisite methodologies and tools. The methodologies/tools were presented and extensively discussed. Where necessary, improvements to the tools were made during working group sessions with a view to developing the "CSTAA RBM toolkit".
The presentations and discussions that dealt specifically with monitoring and evaluation, and that therefore had direct relevance to the development of the CSTAA RBM toolkit, were the main basis for the workshop. The tools about which full agreement was achieved during the workshop, and that have been fully developed, have been field-tested by advisers and national partners. Experimentation is therefore perceived as a critical component of the implementation of RBM. Feedback obtained from the application of the tools is used to refine and improve them. The RBM tools are shared with other CSTs and with UNFPA headquarters, and might eventually be applied universally to country programs. Country support teams have also developed their own competencies in RBM by dedicating resources to the field of RBM. For instance, in the Bangkok office, a staff member is responsible for providing technical assistance to enable UNFPA country office staff and national counterparts to improve monitoring and evaluation activities in the context of results-based program management, including the identification and retrieval of indicators, through conducting training workshops on RBM, the logical framework approach, and monitoring and evaluation for national executing and implementing agencies and UNFPA CO staff. UNFPA has also contracted with consulting firms to provide expertise in RBM. For instance, a Canadian-based firm recently provided training to UNFPA headquarters and regional staff in results-based management and performance review methods. The training sessions made use of RBM tools and concepts adapted to the UNFPA mission and programming context.

5.3.2 UNDPvii

Background

The notion of results and "managing for results" is not new to UNDP. UNDP policy calls for RBM principles to be applied at all levels of activity, from the headquarters level down to the individual project level in country offices.
The impetus for a stronger results focus came from UNDP’s various “clients” and partners, who called for more transparency in the way projects were funded and evaluated. UNDP had not always been able to demonstrate results to the full satisfaction of donors. The RBM system being put in place therefore responded to this concern by setting out clear programme and management goals for the organization and establishing indicators to monitor and assess progress in meeting them. For many years the organization had been working to get things done and produce results, but its emphasis was mostly on managing inputs and activities. Indeed, UNDP’s mandate and range of activities were often seen as too broad for a successful results-based performance framework. Moreover, UNDP was seen essentially as a broad funding agency focused on institution building, with no real substantive position on development issues. In early 1998, UNDP decided to develop a framework for the measurement and assessment of programme results, and the Evaluation Office was assigned lead responsibility for its design. This step initiated the introduction of results-based management (RBM) in UNDP. As the discussion on future funding was translated into the first Multi-Year Funding Framework (MYFF), RBM became part of an exceptionally unified management effort, coordinated by the Bureau of Management, to ensure that a performance framework was in place. Since 2000, responsibility for coordinating the RBM process in the organization has been assigned to the Operations Support Group. Key components of the results system at UNDP include both planning and reporting instruments, and all training is concentrated in these areas. The key planning instrument is the Multi-Year Funding Framework (MYFF), linked to the Evaluation Plan.
As many have argued, the MYFF represents a milestone for UNDP: for the first time in the organization’s history, the funding strategy is based on the identification of clear results and the establishment of an integrated resource framework (IRF) that incorporates and presents in a transparent manner all financial allocations covering programmes, programme support and administrative operations.viii Like other results frameworks introduced in the public sphere over the last ten years, the MYFF is based on the premise that public institutions can no longer lay claim to public resources on the basis of mandates alone; rather, they have to outline specific programmes and services and demonstrate impact. The evaluation plan prepared by all operating units keeps the organization up to date continuously, annually or periodically depending on local needs; in essence, it is a “rolling” plan.

The main RBM reporting instruments are the Results-Oriented Annual Report (ROAR) and the Multi-Year Funding Framework Report (MYFFR). The ROAR is the principal instrument for reporting on a yearly basis, at both the country and the corporate level, on the entire range of UNDP activities, whereas the MYFFR is a more in-depth assessment of performance relating to the outcomes and outputs identified in the MYFF. It is produced every four years.

Training

Most of the training and experimentation in recent years has been dedicated to the MYFF and RBM.ix Implementing the MYFF and the RBM culture has been seen as a learning process that cannot be completed over a short period of time. Any such system needs to be seen as a work in progress, evolving over a considerable period and incorporating flexibility to make changes as experience is gained.
Training has also been continuously offered at headquarters and country office levels in monitoring and evaluation (M&E), essential functions of the management cycle that are nonetheless difficult to install, since they involve critical qualitative assessment of performance and corrective measures that can be problematic to implement. Monitoring and evaluation is conducted from quantitative inputs at headquarters and country office levels within the MYFF and ATLAS frameworks. RBM is therefore quite integrated at UNDP: if indicators are not well defined, MYFF reporting is not done properly and evaluation becomes impossible; in such cases, impact assessments of development spending are rare as well. Owing to generally weak strategic planning, limited management resources and budgets for M&E, and a lack of common understanding of the RBM culture, the evaluation function has evolved somewhat erratically. While donor support for UNDP is unquestionable, notably through the increased move towards un-earmarked funding, the call for a clear RBM system and for more reliable and regular information on projects and results is equally resonant. As a progress reporting tool the MYFF may be sufficient, but without an extensive RBM framework it is impossible to measure intangible outcomes. Various workshops have been offered in initiatives undertaken to compare RBM systems, a distinction often being made between managing by results and managing for results: the former is principally oriented towards accountability and external reporting; the latter focuses on a cycle of planning, periodic performance assessment and organizational learning.x In implementing RBM, UNDP made a deliberate decision to emphasize learning. This was based on an unequivocal message from donors that RBM must explicitly aim at changing the way the organization is managed, fostering a strategic orientation and a culture of performance.
Two-day workshops on RBM have been offered in recent years. The workshops focus on the results chain and rely mostly on small group exercises. Topics include impacts and outcomes; logical framework analysis; goals, purpose and outputs; performance measurement and indicators; indicator development; and assumptions and risks.

RBM training has been offered in all countries of operation. In Africa, for instance, the training focused on capacity development in (a) performance management, (b) results-based budgeting and (c) monitoring and evaluation. UNDP Zimbabwe trained government officials in the concepts of Results-Based Management (RBM) to improve service delivery by government agencies. For the Arab States, over the period 2004-2005, UNDP also conducted training workshops on RBM and public budget management, targeting 250 participants (deputies, parliamentary staff and ministry representatives), as well as an awareness-raising workshop on RBM for NGO representatives. UNDP is also providing extensive RBM capacity-building training for governments. In Albania, for instance, the government lacked a tradition of results-based management: priorities were not properly quantified in terms of measurable indicators or properly programmed, monitored and evaluated, and planning, monitoring and reporting systems and frameworks were disconnected from each other. UNDP has been supporting the strengthening of national capacities, through the Institute of Statistics (INSTAT) and line ministries, to collect, analyse and report on development data through the implementation of RBM systems, and the training of relevant authorities in the effective application of those systems for national and regional development, reporting, planning and the distribution of resources. Training has also focused heavily on the use of indicators. Within the RBM framework, UNDP uses at least three types of results indicators: 1.
Situational (impact) indicators, which provide a broad picture of whether the developmental changes that matter to UNDP are actually occurring (impact and situational indicators are essentially the same, although the former may be more specific and the latter more generic); 2. Outcome indicators, which assess progress against specified outcomes; 3. Output indicators, which assess progress against specific operational activities. UNDP staff have been extensively trained in the development and use of indicators.

Major drawbacks in implementing RBM

Many UN organizations have introduced the MYFF, the multi-year funding framework for reporting results. UNDP and UNFPA, for instance, each implement a MYFF which describes the strategic goals and service lines to be pursued by the organization, and details the organizational strategies to be followed over the MYFF period. Based on the empirical evidence of programme choices being made on the ground by programme countries, and linked to the global consensus reflected in the Millennium Declaration and the Millennium Development Goals, the strategic directions proposed in the MYFF define a common ground where the two converge. The UNDP MYFF was introduced in 2000 as a critical tool to articulate and sharpen the priorities of the organization and to develop means of achieving its objectives within programming cycles. It is the most comprehensive report on the performance of UNDP, aggregating data from over 140 programme countries and reporting on performance in all the practice areas. The MYFF also provides a succinct analysis of the efficiency with which UNDP financial resources have been used. The UNFPA MYFF was also introduced in 2000. The second MYFF, currently under implementation, defines UNFPA contributions to development results, indicates how progress will be measured and identifies the strategies to be used to attain results.
The MYFF report presents information on about 120 programme countries, highlighting progress made towards national population and development objectives. It also indicates how UNFPA has contributed in particular areas of its mandate to support national partners, and presents a set of robust indicators of UNFPA organizational effectiveness. Although the MYFF is often perceived as an innovation, it can still be an obstacle to implementing the RBM culture. First of all, UNDP and UNFPA maintain individual MYFF-based reporting arrangements which differ from each other in several important respects.xi For example, the UNDP reporting arrangements are based on a conceptual distinction between the achievements of UNDP per se in a programme country and the achievements of the programme country itself, whereas in the UNFPA approach, reporting on the MYFF interprets country performance using a set of development results associated with the mandate of the organization. Additionally, there are significant differences in the data systems that each organization has put in place for reporting. Harmonization of procedures is therefore critical to allow proper management of resources across organizations. Furthermore, the MYFF should not be interpreted as an RBM tool, since it is not an adequate system for performance reporting: it is more a traditional budgetary monitoring and evaluation system, focused more on ensuring that the budget is spent than on service delivery. As a result, MYFF reporting systems are not tuned for quick reporting, and the widespread implementation of complex automated accounting and performance systems is difficult in an environment of poor M&E technical skills and budget constraints. According to a study by the author,xii reporting under the MYFF is not easy, for the following reasons:

- The MYFF does not necessarily capture all key results (i.e.
busy officers tend not to collect data when indicators are not well defined);
- Regional and local programmes are not always captured by the MYFF database, and country officers seem to confuse core results for reporting;
- Country officers note a definite lack of clarity as to what each driver means for work in a particular service line; furthermore, incorporating certain development drivers (e.g. gender or human rights issues) in context can be particularly challenging, and the reporting difficult;
- There is a lack of precise indicators to measure performance, or, where they exist, they are too subjective. For instance, what does it tell you to say that projects are “achieved”, “partially achieved” or “not achieved”?

As a result, many of the MYFF reports reviewed for this report introduce a lack of clarity about who can claim success and/or accountability for outcomes and implementation. There is a need today for a system that includes a strong programme of monitoring, including capacity development to gather and analyse information for impact assessment of public investments (based, for instance, on outcome indicators).

5.3.3 The World Bank

Over recent years, the World Bank has been strengthening the results focus of its operations. The 2002 International Conference on Financing for Development in Monterrey and the joint statement issued by the heads of the multilateral development banks (MDBs) highlighted the need to better measure, monitor and manage for results. In 2003, the Bank began implementation of its Managing for Development Results Action Plan. Within the Bank, a key element is the Results-Based Country Assistance Strategy (RBCAS), which was piloted and then mainstreamed in 2005. The RBCAS includes a results framework that links country development goals with Bank operations and forms the basis for Country Assistance Strategy (CAS) monitoring and evaluation.
The Bank also introduced results frameworks into projects and sector strategies: staff outline their expected outcomes and explain how they are to be achieved.xiii It is important to note that the Bank introduced results frameworks at the project level for most of its investments in 2004. A results framework focuses on the project development objective and intermediate outcomes, both of which are to be supported by performance indicators. These indicators are used to track progress toward meeting the development objective and to make changes to the project, if necessary, during implementation. Indeed, a few years after its introduction, country directors and task team leaders can use the results frameworks to establish the extent to which the Bank can be held accountable for producing results. According to the 2006 Annual Report on Operations Evaluation, 25 results frameworks were prepared during fiscal year 2005-06 at the country level. However, the analysis confirmed weaknesses in the approach, mostly poorly articulated results chains and a lack of indicators with baselines and targets. It is argued that these weaknesses lessen the usefulness of the results frameworks as country programme monitoring and management tools.xiv As a result, training has been seen as a major component of reinforcing a results-based culture in the Bank’s operations. Various areas of training need have been identified, including: 1. Developing specific country development goals; 2. Identifying the nature and extent of the Bank’s contribution to achievement of the goal; 3. Implementing a framework for monitoring the intermediate outcomes, actions or outputs that mark progress toward achievement of the CAS outcomes; 4. Developing performance indicators that are measurable and include baselines and specific, time-bound targets within a strong results framework.
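To make concrete what a complete results-framework entry contains, the following minimal sketch models a result with its performance indicators and flags exactly the gaps the review criticizes: missing indicators, and indicators lacking baselines and specific, time-bound targets. This is purely illustrative and not drawn from any of the organizations discussed; all names and figures are hypothetical.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Indicator:
    name: str
    baseline: Optional[float] = None   # value measured at programme start
    target: Optional[float] = None     # specific target value
    target_year: Optional[int] = None  # time bound for the target

    def is_complete(self) -> bool:
        # Usable for monitoring only with a baseline and a time-bound target.
        return None not in (self.baseline, self.target, self.target_year)

@dataclass
class Result:
    statement: str                     # an outcome, not an activity
    indicators: List[Indicator] = field(default_factory=list)

def framework_gaps(results: List[Result]) -> List[str]:
    """Flag the weaknesses the review describes: results without indicators,
    and indicators lacking baselines or time-bound targets."""
    gaps = []
    for r in results:
        if not r.indicators:
            gaps.append(f"{r.statement}: no indicators")
        for ind in r.indicators:
            if not ind.is_complete():
                gaps.append(f"{r.statement}: '{ind.name}' lacks baseline/target")
    return gaps

framework = [
    Result("Primary-school completion improved",
           [Indicator("completion rate (%)", baseline=62.0,
                      target=75.0, target_year=2009)]),
    Result("Institutional capacity strengthened",
           [Indicator("ministries with functioning M&E units")]),
]
print(framework_gaps(framework))
```

The second entry illustrates the most common weakness cited in the report: a plausible outcome statement ("capacity strengthened") whose indicator has neither a baseline nor a target, making it impossible to verify progress.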
According to the Annual Report, there is a strong incentive to accelerate a results-oriented training and communications programme for management and staff to encourage the use of M&E information. The Monitoring and Evaluation Action Plan for 2001-2003 also introduced results-based M&E in pilot countries and provided training and technical support to increase the Bank’s own capacity to manage for results. Experimentation has been a key component of introducing the RBM culture at the Bank. However, a few evaluation reports argue that the effort was too fragmented at the beginning and that not enough funding was made available. In a recent external survey conducted in 2006, which queried 2,759 external clients from governments, international organizations, bilateral donor organizations, non-governmental organizations (NGOs), academia and the general public, 46 percent of respondents asserted that more training should be provided to improve the capacity of the Independent Evaluation Group. It is also noted that such training should be provided to “front-line” staff in order to improve the overall capacity to deal with results management.xv It should be noted that during 2006, Operations Policy and Country Services issued a review note on some 50 country programmes, providing guidance to teams on the modalities and current practice of results-oriented programme reviews. The note examines the evolution of Country Portfolio Performance Reviews and their growing importance as a management tool and a tool for policy dialogue. It also argues for the need to strengthen country programmes through guidance to individual teams and by facilitating cross-country learning, particularly on results management. The World Bank is also strengthening its support for results-oriented capacities in all client countries.
Such efforts have been particularly evident in middle-income countries in Latin America, where the Bank, through its Latin America and Caribbean Region, and the Inter-American Development Bank are supporting a network of M&E system managers. It should be noted that the World Bank also offers an International Program in Development Evaluation Training in collaboration with Carleton University. This four-week course programme covering the field of RBM ran for the sixth time in mid-2006. In addition, IEG provides condensed versions of the training in various countries, such as Trinidad and Tobago and India, and in Africa; the training was also provided in China in October 2006. The strategy identifies learning priorities and proposes two new initiatives: a managing-for-results training module and a results self-assessment tool.

5.4 Conclusion: Summary of key findings

Bureaucracies are not immovable. With persistence and proper design, the RBM culture can indeed be successfully implemented. However, getting past the barriers to change is often difficult. The introduction of RBM is always challenging when proper training and/or experimentation is not carried out in the right manner. To implement a comprehensive RBM programme, UN organizations need better leadership, and leadership can be taught. When managers have been fully engaged, and hungry for insights into how they might become better managers, they fully embrace the new culture brought by the RBM framework. Still, large organizations such as UNDP or UNFPA find it harder to embrace breakthrough innovations such as results management, which may cost a great deal of money, as senior managers find it difficult to shift resources and systems in a radical way. In such large UN agencies, which tend to stabilize resources for too long, the group tends to become insular, less creative and less bold. Thus, it still appears difficult to adopt a bold RBM agenda over a short period of time.
Other reasons exist for failing to introduce RBM properly, including lack of senior management support, poor communication between country offices, poor system planning and architecture, and weak partnerships and governance. Managing technology is also a key component of successfully introducing the RBM approach. With modern technologies, it is now possible to perform a greater number of experiments and more monitoring in an economically viable way, accelerating the drive toward the reporting of results. However, technology cannot be seen as the only criterion of success: leadership and human capital also need to be properly managed. Most RBM implementations require the necessary impulse from senior management at the highest level. At UNESCO, for instance, the culture was not originally oriented toward rapid change, but the training mechanism put in place by the Bureau of Strategic Planning has facilitated the process of innovating and introducing RBM methodologies. Indeed, to achieve effective RBM innovation, an organization requires a number of critical role-players to collaborate in a formal or informal team relationship. These critical roles include the idea generator, the project manager, several types of information gatekeepers, and the project/team sponsor. Central to the introduction of RBM is the use of proper models, controlled environments, networking, and reporting to management or Member States, which allow the UN “innovators” to reflect on and evaluate the many ideas that are proposed. As a result, constant experimentation and training are required in order to fuel the discovery of the RBM approach. However, successfully implementing a radical RBM culture goes beyond trial and error. It calls for a process of recruiting talent and knowledgeable external experts, implementing proper personnel development and training, and, especially, performance measurement and rewards.
For instance, civil service staff should be properly rewarded for implementing new and often complex RBM frameworks; without incentives, they will be tempted to fall back on existing systems. Only through continuous RBM training will the creation of knowledge lead to the development and improvement of programmes from a results perspective. It is also essential to keep the approach simple: RBM should not lead to an increased workload, and the number of instruments must be limited and easy to understand. Indeed, of critical importance to the successful installation of an RBM system is the recruitment and training of staff. Such training is available and applies at country level as well, recognizing that needs exist for RBM, M&E and IT skills alike to keep a reliable system operative on such a large scale.xvi

6 IMPACT OF THE TRAINING PROGRAMME

The impact of the RBM training should be assessed relative to the overarching programme goal (to affect the formulation of expected results in the C/5 documents) and to the expected results outlined in the programme work plans. Ideally, impact would be assessed using a “counterfactual”, or the state that would have occurred had no training taken place. This would involve comparing the utilization of RBM and the quality of work plans among training participants with those of UNESCO staff who did not participate in the RBM trainings. The abilities of staff could also be assessed before and after the training (the two approaches are not mutually exclusive). However, due to time constraints and a lack of baseline data (both at the individual and the organizational level), these approaches were not possible. Instead, the evidence of impact comes from three sources: 1. Findings from the desk review of official documents; 2. Self-reports by staff on the RBM survey conducted by INDEVAL; 3. Findings from key-informant interviews. This chapter reviews evidence from each source in turn.
In combination, the evidence suggests that the RBM training has made important contributions to enhancing the quality of reporting at UNESCO, has raised awareness of RBM within the organization, and has contributed to skill-building among staff. However, there continue to be opportunities for growth and improvement.

6.1 Evidence from official documents

An examination of statements from official UNESCO documents reveals a somewhat mixed view of RBM. There is some evidence that the formulation of expected results and the application of RBM concepts have improved over time, and in some cases these improvements are attributed, in part, to the RBM training programme. The statements also suggest that, despite positive changes, a substantial margin for improvement remains. It is important to note, however, that expectations regarding the utilization of RBM at UNESCO most likely changed over time; the persistence of some critical commentary is therefore to be expected. Official statements from both before and during the RBM training period are included.

6.1.1 Year 2000-2001

Critical comments

1. “Knowledge on the relationship between output(s), outcome(s) and impact(s): The validation confirmed that programme specialists are sometimes confused about the relationships between outputs, results, outcomes and impacts (as highlighted in other parts of this report). This confusion negatively impacted on the quality of reporting (and is now being addressed through systematic results-based management (RBM) training arranged by BSP).”

2. “Limited experience in reporting of results: Fairly early on in the information collection process, it became clear that the type of reporting that was being received from the sectors and services in their respective Form 1s was weaker than expected.
The Results-Based Management (RBM) training that was given to members of the Secretariat by BSP towards the end of 2001 was geared at the preparation of document 31 C/5 and came too late to have a significant impact on the implementation.”

Cite: Report of the Director-General on the activities of the Organization in 2000-2001, communicated to Member States and the Executive Board in accordance with Article VI.3.b of the Constitution, 32 C/3, 2002, page 280.

6.1.2 Year 2002-2003

Encouraging comments (These comments suggest that changes were underway prior to the training)

3. “We observed the progress being made by UNESCO in setting out expected results along with performance indicators as seen in 32 C/5 Draft compared to previous C/5 documents, including the material on the Institutes. We found the related IIEP documents to match these improvements by setting out IIEP expected outcomes. And reporting results against expectations as found in 32 C/3 has also improved.”

Cite: Report by the External Auditor on the Performance Audits Undertaken in the 2002-2003 Biennium, 169 EX/29 [attached to 33 C/INF.8], March 19, 2004, page 34, paragraph 115.

4. [Follow-up of previous report recommendations] “The logical structure and stability of MLAs, paragraph 189. … The MLAs of both IHP and IOC are becoming more stable….”

Cite: Report by the External Auditor on the Performance Audits Undertaken in the 2002-2003 Biennium, 169 EX/29 [attached to 33 C/INF.8], March 19, 2004, page 39, paragraph 141.

5. “Expected results: The formulation of expected results has improved for document 31 C/5 when compared to document 30 C/5; this could be already attributed to the training in RBM conducted by the BSP. However, several of the expected results which the C/3 team tried to verify were rather vague, being formulated using terms such as “capacity strengthened”. Without any indication as to the original capacity, it is not possible to verify whether capacity has indeed been strengthened.
Further, the bulk of the expected results refer to “what was undertaken” using descriptions such as: “donate”, “provide”, “support” and “organize”. The submission by Sectors on training were particularly problematic in that few of them actually indicated what the beneficiaries of the training had been successful in doing with their new or updated knowledge and skills. However, the situation is expected to improve as document 32 C/5 has included performance indicators. BSP needs to continue working with the Sectors to improve their performance in formulating expected results.”

Cite: Report of the Director General 2002-2003, 33 C/3, 2005, page 66.

Critical comments

6. “Better reporting on results is required. Having noted progress, significant improvements to the reporting of results are still required, as UNESCO admits. Expected results for the Institutes are sometimes unclear and frequently are:
• statements of activities to be carried out and sometimes the outputs to be produced, rather than outcomes expected to be accomplished,
• not presented as part of a chain of results anticipated to occur,
• focused only on the two year biennium period,
• without any targets so as to be able to know if the expectation has been met,
• without any base line data setting out the current level of the results expected.”

Cite: Report by the External Auditor on the Performance Audits Undertaken in the 2002-2003 Biennium, 169 EX/29 [attached to 33 C/INF.8], March 19, 2004, page 34, paragraph 116.

7. “In UNESCO, we found no clear definitions and explanations of such terms as ‘expected results’ and limited guidance on how to report results. The practice of public reporting on results is gradually maturing across many jurisdictions and principles for good reporting are emerging.
And in the Institutes, practice and interpretation of how to set out expected results varies.”

Cite: Report by the External Auditor on the Performance Audits Undertaken in the 2002-2003 Biennium, 169 EX/29 [attached to 33 C/INF.8], March 19, 2004, page 34, paragraph 117.

8. “Implementing results-based management is a challenge and the Institutes’ staff would benefit from appropriate training and assistance.”

Cite: Report by the External Auditor on the Performance Audits Undertaken in the 2002-2003 Biennium, 169 EX/29 [attached to 33 C/INF.8], March 19, 2004, page 35, paragraph 118.

9. “UNESCO and the Education Institutes should continue to improve the statements of the C/5 expected results, keeping in mind recommendation 189 of 165 EX/29/Add.”

Cite: Report by the External Auditor on the Performance Audits Undertaken in the 2002-2003 Biennium, 169 EX/29 [attached to 33 C/INF.8], March 19, 2004, page 35, paragraph 122.

10. “Evolution of RBM: The application of the RBM logic in reporting has increased. Colleagues are more familiar with the idea of ‘result’. There is, however, still room for improvement. The introduction of the use of performance indicators in document 32 C/5 should help in this regard. Also, it would be useful to develop a set of common indicators for assessing the performance of the programmes for each of the five functions.”

Cite: Report of the Director General 2002-2003, 33 C/3, 2005, page 63.

11. “Performance indicators: Performance indicators are generally absent or vague, both in document 31 C/5 and in the activity sheets submitted by the Sectors. Document 32 C/5 contains a number of performance indicators and it is understood that they will be used more extensively in document 33 C/5. This will go a long way in assisting the programmes to properly define various dimensions of their expected results.
Here also BSP will have to continue working with the Sectors in formulating indicators.”

Cite: Report of the Director General 2002-2003, 33 C/3, 2005, page 66.

6.1.3 Year 2004-2005

Encouraging comments

12. “…The application of the RBM approach continued to expand and become more refined. All the expected results given in document 32 C/5 for the programme sectors also had performance indicators, a distinct improvement over document 31 C/5. It should be noted that in document 33 C/5 the RBM approach was further improved by adding benchmark targets to the performance indicators. The submissions made in the MLA/Unit Forms contained a number of good results. There is, however, room for improvement, as the familiarity and understanding of staff members with the practice of RBM needs to be strengthened...”

Cite: Joint Report by the Director-General on the Implementation of the Programme and Budget (32 C/5) and on Results Achieved in the Previous Biennium 2004-05 (Draft 34 C/3), 174 EX/4, March 17, 2006, page 62.

13. “The current work plans for the Approved Programme and Budget for 2004-2005 (32 C/5), including their expected results and associated performance indicators, are being continuously improved as a result of the RBM training. For the first time, all Main Line of Action entries now show expected results.”

Cite: Report by the Joint Inspection Unit (JIU) of Interest to UNESCO and the Status of Implementation of Recommendations of Previous Reports and on Results Obtained, 171 EX/38, February 28, 2005, page 6.

14.
[Expected result] “Results-based programme planning, monitoring and evaluation approaches refined and applied in programme and budget documents, reports on the implementation/Training in RBM developed and offered at Headquarters and in field offices/Assistance provided to field offices in results-based management (RBM) and work plan management”

The following results were obtained: (i) RBM skills, in particular results formulation, broadened throughout the Organization through training and coaching programmes offered to about 200 staff members in field offices and Headquarters since January 2004. A set of reference and training materials was developed. Feedback indicates a positive impact of the workshops on staff capacities, as is also reflected in improved quality and breadth of results formulation in document 33 C/5; (ii) Expected results and performance indicators of 32 C/5 work plans continuously improved”

Cite: Information Concerning the Implementation of the Programme and Budget for 2004-2005, 33 C/INF.3, September 29, 2005, page 97.

Critical comments

15. “A final issue of consistent weakness identified by the evaluations is that of RBM. … the specification of expected outcomes and the content of internal/external reporting emphasizes the completion of tasks and activities to the exclusion of intermediate and final outcomes. This limits the usefulness of results-based data for decision-making purposes. While each institute/centre should take steps to improve RBM practices, there is a need for more guidance and training by the UNESCO Secretariat.”

Cite: Biennial Evaluation Report on the Activities and Results of all UNESCO Decentralized Bodies, 174 EX/20, February 23, 2006, pages 9-10.

16.
“…The second tranche of evaluations confirmed a general need for strengthening of RBM practices and, consequently, there is a need for a major programme of RBM training for other institutes/centres.” Cite: Biennial Evaluation Report on the Activities and Results of all UNESCO Decentralized Bodies, 174 EX/20, February 23, 2006, page 10. 17. “…The verification exercise found that entries on strategies were often mixtures of background information, delivery mechanisms, expected results and activities to be carried out. The justifications for given activities were sometimes unclear. Often what appeared as expected results were actually the activities to be undertaken by UNESCO. Also, the performance indicators often refer to activities such as “number of workshops organized”, and do not refer to the results expected. The most common challenge encountered was with “capacity-building”. It was rarely possible for the C/3 team to verify whether capacity had indeed been strengthened by/through the various training workshops in the absence of baseline information. Hence there is room and need for further staff training, already under way.” Cite: Joint Report by the Director-General on the Implementation of the Programme and Budget (32 C/5) and on Results Achieved in the Previous Biennium 2004-05 (Draft 34 C/3), 174 EX/4, March 17, 2006, page 63. 18. “…With regard to the RBM approach there is an urgent need for better internalization and application of concepts such as strategy, activities, results and results chains, performance indicators, benchmark targets, delivery mechanisms and modalities, and criteria for assessing performance. Efforts to address the challenges noted above, particularly the issues with weak monitoring of programme implementation, are urgently needed. 
BSP together with IOS will work on this with a view to continue improving both RBM skills and monitoring among staff, and will relate them to the growing challenges to UNESCO to contribute to country-level programming exercises, such as CCA, UNDAF or PRS.” Cite: Joint Report by the Director-General on the Implementation of the Programme and Budget (32 C/5) and on Results Achieved in the Previous Biennium 2004-05 (Draft 34 C/3), 174 EX/4, March 17, 2006, page 63. 6.2 Evidence from the INDEVAL participant survey Training programme participants generally found the training to be informative and helpful for skill-building. While a handful of individuals found the training not helpful at all, they are a minority. The results of the INDEVAL participant survey suggest that the training has enhanced staff orientation to RBM, while also building skills in the area of programme planning. Was the training programme an unqualified success? Certainly, improvements could be made, but positive impact in the following areas should be acknowledged: Ability to formulate results: According to participants’ self-reports, the RBM training programme made a positive impact on their ability to formulate expected results for a work plan. Specifically, 56% of respondents indicate that their ability to formulate results improved “somewhat” as a result of the training programme, while a notable 35% indicate that their abilities improved “very much” as a result of the programme. Only 9% of respondents felt the training programme had no impact on their ability to formulate results. While it may be the case that respondents are inclined to overstate their professional skills, most (61%) respondents found the task of formulating expected results for a work plan to be “moderately easy.” Nearly one-third of survey respondents indicated that they found developing expected results to be moderately difficult (25%) or very difficult (4%). 
Ability to formulate performance indicators: Participants’ assessments of their skills with respect to performance indicators, and the impact of the training programme on those skills, are nearly identical to their assessments in the area of results. The majority of respondents (91%) give the training programme some credit for improving their skills either a great deal (29%) or somewhat (62%). Again, while most trainees find formulating performance indicators to be very easy (6%) or moderately easy (59%), approximately one-third suggest that they find the task at least moderately difficult. Utilization of RBM concepts in work processes: An encouraging finding from the INDEVAL participant survey is that 74% of respondents apply RBM concepts to work processes other than developing work plans – important for mainstreaming RBM throughout the organization. Participants have extended what they learned to a wide variety of work tasks, ranging from managing staff to writing funding proposals to organizing conferences and workshops. In this regard, the training programme appears to have had a positive impact on the way staff think about results. As one key informant suggested during an interview, training appears to have helped people in the process of reasoning and increased the precision in the way they think about their work. The survey results are encouraging and suggest that continued training and support can have additional and positive impact on results formulation. The programme appears to respond well to an organizational need and it has been well received, particularly by field office staff. Overall, participants are highly favourable towards continued support in the area of RBM. The desire to keep the project going through ongoing follow-up is a recurring comment. BSP could build on the desire for follow-up to the RBM training by taking advantage of information technology. 
As noted previously, developing an intranet site for results-based management could provide needed resources to staff while simultaneously enhancing the sustainability of the training programme. At present, only a Results Based Management Guide is available on-line. It introduces a number of fundamental concepts that underpin RBM, but this document, drafted by RTC Consultants in Ottawa, is not used by the RBM training team. Fortunately, the BSP RBM team has recently developed a “guiding principles” document that can be used as a launching pad for follow-up assistance. As such, a comprehensive communication and dissemination strategy should be developed - one that ideally includes web-based distribution. 6.3 Evidence from key-informant interviews Key-informant interviews were conducted at headquarters in Paris, where individuals articulated interest in and enthusiasm for results-based management but also some skepticism about the impact of the training programme itself. On a positive note, interviewees seemed to feel that awareness of results-based management at UNESCO had improved notably over the last few years. Other positive comments included: The use of RBM concepts has genuinely improved over the last few years; The RBM training was critical in the transition to the new format for the C/5; The training provided a clear understanding of the added value of using RBM for strategic planning; Colleagues indicated that they enjoyed the training; and Coaching builds ownership of RBM in a way that general training does not. On the other hand, in some cases, there was a lack of clarity regarding the exact nature of the RBM trainings, when and where they were held, and who was involved. A formal communications strategy for informing senior staff about the training programme was lacking and could help in this regard. 
In discussing the training programme, concerns were raised regarding a number of organizational constraints that may dampen the impact of the RBM training efforts. They included: Sector-specific needs: Interviewees were concerned that a one-size-fits-all approach to RBM training does not take into account the important differences across sectors and the challenges associated with framing results for programmes that aim to have a positive but somewhat abstract, hard-to-measure, and long-term impact. Monitoring and evaluation: Interviewees were concerned that insufficient emphasis is being placed on using RBM to help staff monitor outputs and outcomes. One interviewee highlighted the link between developing a solid work plan with well-formulated expected results and indicators, using that work plan to monitor outcomes, and building ownership of RBM. In particular, learning how to use the work plan as a professional tool can build ownership in RBM concepts. However, without attention to monitoring, developing a work plan can seem just like an exercise. Difficulties with SISTER: While it is not within the scope of this evaluation to assess the strengths and weaknesses of SISTER, interviews revealed concerns that the system may not facilitate successful use of RBM concepts, particularly where the level of detail in SISTER is burdensome for staff and does not necessarily add value for planning or monitoring. The role of the budget in the planning process: There was some concern that the role of the budget in the planning process affects utilization of RBM concepts. Specifically, the driving role of the budget envelope in the planning process means that, in some cases, aligning the right activities with the right MLAs to achieve a specific set of results can be a challenge. Despite these concerns, there was a general sense of commitment to make training effective and a willingness to collaborate with BSP to do so. 
7 MAJOR FINDINGS AND RECOMMENDATIONS Overall, this evaluation provides a relatively positive assessment of the RBM training programme launched and implemented by UNESCO’s Bureau of Strategic Planning between June 2003 and November 2005. Approximately 400 individuals around the world received both comprehensive and personalized assistance from a relatively small staff with a sizeable objective. The evaluation revealed the following programme strengths: RBM culture has been enhanced. UNESCO has been moving towards results-based management (RBM) by introducing policy and programme changes to improve the quality and increase the impact of its programmes. The introduction of the RBM training is an important factor in this move as evidenced by the use of RBM concepts in work plans and other work activities, as well as the improved formulation of results in the C/5 noted by the Executive Board. A strong RBM training programme is important for the continuous improvement of the C/5 and the work plans. Participants are generally satisfied with the training programme. The UNESCO staff who have been trained demonstrate a general satisfaction with the RBM training as well as initiative regarding the application of RBM concepts. Generally, participants feel more knowledgeable about the formulation of their inputs into the planning process. Specialized assistance on RBM is valued. Staff appear to value the opportunity to learn about results-based management. At the same time, they seek personalized assistance and opportunities to gain hands-on experience with RBM concepts as they apply to work plans and other planning documents. This is provided and valued in the formal RBM training sessions. The RBM team complements other programming services provided by BSP. The mix of formal training and coaching helps bridge theory and practice. Results-based management concepts are not always easy to understand. 
For many, RBM is a new way of thinking about programme planning, implementation, and assessment of progress. Offering training that provides a solid overview of RBM theory, as well as opportunities for hands-on work – either through group work or through personalized coaching – bridges theory and practice. While the overall assessment is largely positive, the training programme does face important challenges that should be noted and are relatively easy to address. The major challenges and associated recommendations fall into two categories: 1) programme management and 2) training content and implementation. Programme management Coherent and predictable operation of the RBM training programme is critical for long-term success. Despite hard work and enthusiasm, important aspects of managing the programme fell short, hampering its effectiveness. In particular, creation and/or approval of key programme planning documents such as an approved work plan and budget were delayed and formal mechanisms for supporting the RBM training team were somewhat weak. There is a need for some work process changes to ensure the sustainability of the training programme over time. ⇒ Recommendation: (1) Develop, maintain, and monitor a comprehensive work plan for the RBM training programme, complete with performance indicators and a communications strategy for informing field offices and headquarters staff about the availability of RBM training. Limited human and financial resources may weaken the long-term sustainability of the RBM training programme. Although an in-depth assessment of resource availability and utilization was not possible, the budget for the training programme appears to have been quite tight (as evidenced by the fact that many staff were not charged directly to the programme) and the staffing pattern of essentially three full-time individuals appears to have been somewhat light. 
Moreover, the RBM team has been constrained in developing a sustainable programme that can easily withstand changes in staffing. Looking forward, what is needed is stability, coherence, and predictability of resources. ⇒ Recommendations: (1) A Financial Sustainability Plan could be prepared for the next budget cycle. This document would assess the key financing challenges facing the training programme, and would describe the unit’s approach to mobilizing and using financial resources to support medium- and long-term programme objectives. (2) UNESCO/BSP could undertake a systematic approach regarding funding for RBM training. A “Donor Reference Group,” comprising some delegations, could be explored and would allow donor governments to exchange views on RBM culture at UNESCO, and this arrangement could be seen as a forum for donor pledging as well. Training content and implementation Participants’ skills have improved but these skills need to be enhanced to increase correct use of RBM concepts. Despite the generally positive assessment offered by participants, it is clear from self-reports, interviews, and official documentation alike that additional skill-building is needed to help staff formulate results for work plans and to develop performance indicators correctly. Both interviews and survey data highlighted the importance of follow-up to refine skills and the willingness of staff to participate in ongoing training. In addition, while the training programme received “high marks” from many participants, opportunities exist to strengthen the content. It is clear from the comments in the post-participation evaluations and from the INDEVAL participant survey that staff members seek opportunities to practice applying RBM concepts to specific work tasks. Very few individuals complained that the training content was too theoretical. However, many indicated that practical exercises were lacking. 
⇒ Recommendations: (1) Continue to offer the RBM training programme, but ensure that training needs are clearly assessed and materials are targeted to those needs prior to embarking on another round of trainings. More than one set of training materials could exist (e.g. introduction to RBM applications, intermediate follow-up training materials, advanced “train the trainer” materials). (2) Review the training materials with an eye to practicality. Update materials with new examples and exercises. The integration of practical, UNESCO-specific and sector-specific examples in the PowerPoint training slides is critical. Identify individuals, offices, and sectors that are succeeding in RBM applications. Work with them to identify “what works” and “what to avoid” and incorporate that feedback into the examples and exercises for the training programme. (3) Provide formal, structured, targeted training (e.g. coaching) to staff involved with formulating results at the MLA level, drawing on both the RBM training staff and well-trained BSP sector liaisons as coaches. (4) Offer the three-day formal RBM training sessions at headquarters, which to date has received only coaching sessions and partial trainings. (5) Provide self-learning tools for staff, perhaps by developing a comprehensive intranet site that uses the existing PowerPoint materials to develop an “e-learning” course, complemented by other practical materials such as templates, worksheets, and reference materials. Manage and update the site regularly. If resources permit, the e-materials could also be turned into a CD-ROM for distribution in field offices that have bandwidth constraints or other Internet access challenges. (6) Incorporate RBM training documentation (perhaps from the e-learning materials described above) into the online orientation materials currently provided to new staff. For RBM to be applied effectively, training must take “intra-UNESCO” differences into account. 
The data suggest not only that the training sessions need a more practical (“how-to”) orientation, but also that specific attention needs to be paid to the different approaches that different sectors must take to formulating expected results. In particular, interviews repeatedly highlighted the fact that some sectors seek to achieve long-term outcomes that are difficult to measure. ⇒ Recommendations: (1) Collaborate with sector staff and BSP sector liaisons to tailor the training materials to sector-specific circumstances by modifying examples, exercises, and handouts depending on the audience and their needs. Quality control and monitoring of programme outcomes could be enhanced. Efforts were made throughout the duration of the training programme to monitor important aspects of quality, particularly participant satisfaction. Post-participation evaluations were useful for adjusting programme design but could have been enhanced substantially to improve quality management. Weaknesses in evaluation design hampered the ability of BSP to monitor changes in learning and behaviours as regards RBM. ⇒ Recommendations: (1) Collect comprehensive and reliable data on programme participants in order to know the characteristics of who participates. (2) Use a pre/post survey design with a carefully constructed instrument to assess both participant satisfaction and changes in knowledge. (3) Collect post-participation data from all RBM trainees, including coaching sessions, to provide BSP with useful information about the different categories of assistance being offered. (4) Link data collection with performance indicators associated with a well-developed work plan to enhance programme monitoring and enable more strategic mid-course adjustments if necessary. (5) Enhance quality control by participating in (or building) expert networks, particularly within the UN system. 
This will ensure that UNESCO remains aware of and contributes to best practice in the field, and that RBM trainers have opportunities to build their skills. Finally, BSP can take advantage of two important opportunities to strengthen the RBM training programme, to enhance the mainstreaming of RBM at UNESCO, and to build ownership of RBM concepts. There is an opportunity to enhance the mainstreaming of RBM throughout organizational work processes. When asked if they use RBM concepts in work processes other than developing work plans, 74% of survey respondents said they did so. They offered a comprehensive list of applications, ranging from meeting participation to proposal writing to staff performance assessment. In addition, interviews reveal support for a more ambitious RBM culture at UNESCO, particularly to improve the reporting of results and the monitoring of programme activities. Unfortunately, the lack of a UNESCO framework document regarding the definition and application of results-based management throughout the organization is likely to have hampered the mainstreaming of RBM. ⇒ Recommendations: (1) Develop a conceptual framework for RBM at UNESCO to be applied throughout the organization that promotes common understanding and concepts, and that clearly extends the vision for RBM beyond the formulation of results for the C/5. BSP is expected to release guiding principles for RBM at UNESCO which could make a valuable contribution in this regard. (2) Recognize and reinforce use of RBM concepts throughout the organization by incorporating “profiles of success” in training materials, staff newsletters, and on an RBM intranet web site. Creating an environment that acknowledges and supports the successes and difficulties associated with applying RBM concepts can help build ownership and mainstream RBM throughout UNESCO. 
There are opportunities to build ownership of RBM: There is some evidence from the INDEVAL participant survey that ownership of RBM concepts is growing at UNESCO. However, it has also been suggested that staff feel that their work plans are reworked by BSP, leaving them disconnected from the planning document. Because there is no formal system in place that links the formulation and the monitoring of work plans, staff have little occasion to reflect on their work plans regularly and may not see them as a useful tool. As a result, the RBM training may not take hold to the greatest extent possible. Staff need to feel ownership of their work plans and their expected results in order to see the utility of the training and to use it. ⇒ Recommendation: (1) Extend the use of RBM to monitoring and evaluation. “Monitoring for results” training might be considered an intermediate RBM course. If resources are not immediately available for providing such training, BSP might consider offering workbooks, templates, or other instructional materials to help staff further develop their ability to identify and use performance indicators. ANNEXES 8 TERMS OF REFERENCE Terms of Reference (TOR) for the Evaluation of HRM-Sponsored RBM Training at Headquarters and in Field Offices January 2006 8.1 Background Information Background: Since 2003, HRM, through the Learning and Development Commission, has been supporting training in RBM at Headquarters and in Field Offices. The 2004-2005 biennium represents the first full biennium in which training in RBM was conducted, and this evaluation will thus be the first evaluation of RBM training in UNESCO. 
8.2 Purpose of the Evaluation
Purpose and objectives of the evaluation: The evaluation is to assess HRM-supported and UNESCO/BSP-run training in RBM for programme specialists and other staff members at Headquarters and in Field Offices, with respect to: (i) the quality of the RBM training, and (ii) its effectiveness in terms of the extent to which the training has contributed to improving RBM practices in UNESCO. The evaluation is also to identify pertinent issues to be addressed, and make proposals for improving training in RBM in UNESCO.
Scope of the evaluation: Overall, the evaluation will address the period from June 2003 to November 2005. It will cover participants from both Headquarters Units and Field Offices, and where necessary examine relevant capacity building in RBM-related issues undertaken by the various sectors and in the field offices.
Evaluation questions: In line with worldwide UN reforms and the adoption of Results Based Budgeting and Management, and in particular the use of RBM in UNESCO, relevant questions (indicative but not exhaustive) to be answered by the evaluation would therefore include:
Policies and Management
- What were/are the operational policies in place with regard to training in RBM, covering such details as the selection of staff for training, the number of different categories of staff trained and their Sectors?
- Do senior management, supervisors, and other staff know about these policies?
- What are the experiences of senior management, supervisors and other staff in working with these policies?
- What are the advantages (strengths and opportunities) of these policies?
- What are the disadvantages (weaknesses and threats) of these policies?
- In reality, what were/are the practices with regard to the implementation of the said policies?
- Do senior management, supervisors, and other staff know about these practices?
- What are the experiences of senior management, supervisors and other staff with the said practices?
- What are the advantages (strengths and opportunities) of these practices?
- What are the disadvantages (weaknesses and threats) of these practices?
Quality and effectiveness of the RBM training courses
- What is the quality of the programmes being taught in terms of content, presentation, support material, etc.?
- Are the needs of participants identified before the start of the courses and do the courses respond to their needs?
- Are the RBM training courses adequately timed to suit the schedule of the participants and to ensure the greatest impact on acquisition of RBM skills and its use in the organisation for major activities such as completing the C/5 and the work plans?
- Are the durations of the courses adequate to ensure acquisition of RBM skills to the levels required?
- Are the courses adequately evaluated to ensure that the required quality is being provided and maintained?
- How qualified and efficient are the trainers in terms of qualification, experience, methods of delivery of the courses, etc.?
- How does the RBM training offered by UNESCO compare with training offered by other UN agencies?
Impact of the RBM training
- Do the RBM training courses offered respond to the RBM requirements of the organisation?
- To what extent have the concepts learnt been applied to the work of the sectors and units, as reflected for instance in the work plans, and submissions for the EX/4, EX/5 and C/3?
Possible options for UNESCO
- Given the strengths, weaknesses, opportunities and threats identified with the current policies and practices with RBM training at UNESCO, the issues gleaned from other UN agencies, and the assessment of the quality and effectiveness of the training, what possible options can be envisaged for further development and improvements (where necessary) for training in RBM? Why?
8.3 Procedures and Methods
Overall Approach: The process will require a combination of multiple and complementary evaluative strategies. 
The evaluators should develop an approach that collects both quantitative and qualitative data and seeks to make the evaluation itself a learning process for all parties involved. Building on these Terms of Reference, the evaluators should elaborate their overall approach and methods to be used, including an evaluation matrix. It is anticipated that the evaluation will be organized into successive and partially overlapping phases: (1) Document review and analysis, (2) Data and information collection, and (3) Overall analysis and proposals. Document review and analysis: Phase I will focus on document review and analysis and will lay the foundation for the work to follow. The core set of documents to be reviewed will include: General Conference and Executive Board documentation with regard to RBM, UNESCO RBM manuals, course descriptions, training material, outputs from practical exercises, details of arrangements for backstopping participants after the training, RBM manuals of at least two other UN agencies, assessments of the participants by the trainers, and assessments of the training courses by the participants. Logical Framework Analysis: By the end of phase I of the evaluation, the consultant would be expected to conduct a logical framework analysis or similar exercise (where necessary) with HRM and major stakeholders, with a view to reaching a common understanding of the objectives of the RBM training in UNESCO and the criteria to be used for the assessments in the evaluation. Data and Information Collection: Data and information would have to be collected from UNESCO and, if necessary, from some UN and international agencies. Data collection methods would include a suitable combination from among: interviews with key players, the use of questionnaires, observation of some training sessions (where/when possible), and group discussions. 
Individual interviews and discussions should include selected delegations at HQ, the Directorate, HRM, management and staff of the sectors, a sample of past and present participants in the RBM training courses, BSP and IOS. Overall analysis and proposals: The overall analysis will form the basis for discussing possible options for policies and practices that would be recommended from the evaluation. The evaluators will assume the overall responsibility for the analysis and proposals. Management of the Evaluation: IOS/EVS will manage the evaluation and would provide necessary support to both HRM and BSP with respect to this evaluation, covering such details as assistance with the drafting of the necessary Terms of Reference (TOR), assistance with identifying a suitably qualified external evaluator (consultant), and review and comments on the evaluation plan, the draft report, and the final report. HRM will be responsible for contracting the suitable evaluator identified and providing the necessary logistical support to the external evaluator. 8.4 The External Evaluator / Consultant Selection of consultant: The consultant will be selected from among a minimum of three qualified and experienced professionals. The consultant should have a professional background and/or extensive experience in (a) evaluation, (b) RBM, and (c) UNESCO’s programmes and processes, including the use of RBM, and (d) should be fluent in English or French, with a good working knowledge of the second language. 8.5 Evaluation Budget Budget: The estimated resources available to carry out the evaluation correspond to approximately 45 person-days of professional time. The consultant will have to be self-sufficient as regards logistics (office space, administrative and secretarial support, telecommunication, printing of documentation, etc.), although office space will be provided for time spent in UNESCO Headquarters. 
8.6 Timeframe
The evaluation is to be completed by the end of March 2006. The broad schedule for the evaluation is as follows:
1. Selection of external evaluator/consultant: December 2005
2. Document review and analysis: January 2006
3. Inception report: 30 January 2006
4. Interviews, analysis, proposals: January 2006 – February 2006
5. Submission of draft final report: 28 February 2006
6. Review of draft final report by stakeholders: 15 March 2006
7. Submission of final report: 30 March 2006
8.7 Deliverables of the Evaluation
- Inception report including the Evaluation Plan
- A draft final report with findings (achievements and challenges), lessons learned and recommendations
- A final report with findings (achievements and challenges), lessons learned and recommendations
9 METHODOLOGY
9.1 Introduction
This section contains the methodological framework and guidelines used to evaluate the Results-Based Management Training offered by UNESCO’s Bureau of Strategic Planning. In particular, it addresses the following topics:
- The overarching goal of the evaluation and related research objectives
- The phases of the evaluation activities
- The specific research activities undertaken, the data collected, and the analysis conducted
- The project deliverables and related timetable
- Limitations of the analysis
9.2 Methodology
INDEVAL adopted a systematic approach organized into three phases. Each phase represented a block of work enabling subsequent tasks to be carried out efficiently.
1. Inception phase – This phase consisted of preliminary consultations; compiling a bibliography of documents for review; preparing guidelines for semi-structured interviews; drafting a participant survey; meeting with the UNESCO evaluation point-of-contact; identification of candidates for interviews and discussions; preparation of the inception report; and logistical planning.
2. 
Evaluation phase – This phase consisted of reviewing key documentation; collecting data through semi-structured interviews, discussions, and correspondence; finalizing and fielding the participant survey; and reviewing and formatting RBM post-participation evaluation survey data for analysis. The evaluation phase is described in more detail below.

3. Data evaluation/assessment and reporting phase – This phase consisted of identifying, summarizing, and reporting on themes and trends identified in documentation, through interviews and discussion, analysis of survey data, and analysis of post-participation evaluation survey data.

9.2.1 Evaluation Phase

Part I. Document review

The evaluation began with a review of relevant literature and UNESCO RBM documentation. The purpose of the document review was to provide an understanding of the RBM training programme, its history, its current context, and potential impacts. The primary documents reviewed included:
- General Conference documentation with references to RBM and RBM training
- Executive Board documentation with references to RBM and RBM training
- UNESCO reports with references to RBM and RBM training
- The UNESCO RBM manual
- UNESCO RBM training material
- Assessments of the training courses by the participants
- Learning and Development Commission documents
- Data on total trainings delivered, when, where, and the number of participants
- Any documents related to the design and initial objectives of the training
- Budget and resource utilization information concerning the training

For a complete list of documents reviewed, please see the attached bibliography.

Part IIa. Key informant interviews

Following the documentation review, the evaluation team conducted key informant interviews with senior management, the RBM training team and other high-level staff associated with the training programme and/or evaluation at UNESCO, country delegations, and one consultant.
The interviews focused on RBM training policies, structure, and the integration of the knowledge into work processes. Approximately 20 semi-structured interviews were completed at UNESCO Headquarters. The goal of the interviews was to gather a wide variety of opinions regarding the training programme from key stakeholders. These interviews were guided by a set of key questions (Annex 4). The information gathered was used to develop an understanding of:
- The history and context of the RBM training programme and policies
- The elements of the RBM training programme and its implementation
- Opinions regarding its strengths and weaknesses
- Opinions regarding its impact
- Staff recommendations for improvements

The individuals interviewed are listed in Annex 3. There was no opportunity to conduct field visits to discuss the role and impact of RBM training, or to observe a training session.

Part IIb. Survey of RBM training participants

Assessing the impact of the training on participants' knowledge, skills, and work behaviours was an important part of this evaluation. As part of the document review, the evaluation team reviewed the post-participation evaluation surveys completed by participants. Post-participation survey data were provided in electronic form by BSP. Data entries were assumed to be correct. Minor editing was done on qualitative comments to correct spelling errors.

In addition to analyzing the post-participation survey data, INDEVAL conducted a second participant survey to assess the utility of the training for participants' current work practices. The survey focused on how the participants experienced the training, their opinions about the effectiveness of the training, and how they use the information (see Annex 5). The sampling frame for the participant survey consisted of all UNESCO staff who received RBM training between June 2003 and November 2005, based on a list of participants provided by BSP.
Excluded from the sampling frame were individuals who attended RBM training but who did not work for UNESCO. This exclusion was based on the assumption that the expected results of the training (learning how to apply RBM concepts in a UNESCO context) would not apply to these individuals. Individuals who participated in more than one training session appear in the sampling frame only once: they were assigned to the longer of their multiple trainings, on the assumption that the longer experience best reflects their learning experience and post-participation knowledge base. Finally, individuals for whom no email address was available were also excluded from the sampling frame.

To facilitate the speed at which the survey could be fielded, collected, and analyzed, INDEVAL fielded the survey online using the commercial service provider SurveyMonkey.com. An email inviting RBM training participants to take part in the survey, as well as two follow-up email reminders, were sent from UNESCO IOS in order to 1) minimize the likelihood that the invitation would be delivered to a bulk mail folder, 2) increase the likelihood that individuals would read the email, and 3) increase the likelihood that they would respond to the survey. Emails were sent to 450 individuals, of which:
- 72 emails bounced back as undeliverable, indicating that these individuals no longer worked for UNESCO. In combination with the 12 individuals for whom no email addresses existed, this produced a coverage error of 18%.
- 7 individuals did not recall participating in the workshop, suggesting a potential discrepancy between the attendance figures provided by BSP and the actual number of individuals trained.
- 94 individuals responded to the survey, producing a response rate of 25%.

Table 6: Sampling Frame for INDEVAL RBM Participant Survey

Training | No. | Yr | Length | Location | Treatment
Included:
Bangkok regional | 59 | 2003 | 1 day | Field | Partial training
Dakar sub-regional | 7 | 2003 | 3 days | Field | Full training
Quito regional | 11 | 2003 | 3 days | Field | Full training
Quito office | 19 | 2003 | 3 days | Field | Full training
Bangkok office | 11 | 2003 | 1 day | Field | Partial training
Dakar office | 15 | 2003 | 1 day | Field | Partial training
Addis Ababa sub-regional | 7 | 2004 | 3 days | Field | Full training
Nairobi office | 11 | 2004 | 3 days | Field | Full training
Cairo office | 16 | 2004 | 3 days | Field | Full training
Kingston office | 14 | 2004 | 3 days | Field | Full training
Amman office (+ 3 Ramallah) | 14 | 2004 | 3 days | Field | Full training
Beirut office (+ 3 Ramallah) | 18 | 2004 | 3 days | Field | Full training
Windhoek office | 14 | 2004 | 3 days | Field | Full training
IOC Partial training | 2 | 2004 | 1 day | HQ | Partial training
IOC Coaching | 21 | 2004 | 2 days | HQ | Coaching
Science Coaching | 43 | 2004 | 1 day | HQ | Coaching
Jakarta office 1st workshop | 13 | 2005 | 3 days | Field | Full training
Jakarta office 2nd workshop | 12 | 2005 | 3 days | Field | Full training
Dakar office | 12 | 2005 | 3 days | Field | Full training
Bangkok office 1st workshop | 13 | 2005 | 3 days | Field | Full training
Bangkok office 2nd workshop | 18 | 2005 | 3 days | Field | Full training
Doha office | 8 | 2005 | 3 days | Field | Full training
Montevideo office | 21 | 2005 | 3 days | Field | Full training
Beijing office | 18 | 2005 | 3 days | Field | Full training
Bonn office | 10 | 2005 | 3 days | Field | Full training
Teacher Training January 2005 | 9 | 2005 | 1 day | HQ | Coaching
Culture Coaching | 13 | 2005 | 1 day | HQ | Coaching
Primary Education | 15 | 2005 | 1/2 day | HQ | Partial training
Bureau of Budget | 6 | 2005 | 1 day | HQ | Partial training
Total Included: 450
Excluded:
Ramallah, Palestinian Authority | 15 | 2004 | 1 day | Field | Partial training
Jordanian National Commission | 2 | 2004 | Amman training | Full training
UNRWA | 1 | 2004 | Amman training | Full training
National Commission (Lebanon) | 2 | 2004 | Beirut training | Full training
Duplicates – Bangkok regional 2003 | 13
Duplicates – Bangkok office 2003 | 2
Duplicates – Dakar office 2003 | 4
Duplicates – IOC 2004 | 11
Duplicates – Quito regional 2003 | 1
Without email address | 12
Total Excluded: 63
Total attendees (inc. duplicates): 513
Total UNESCO staff trained: 462
Sampling frame: 450

Part IV. Cursory comparison with another UN agency

The terms of reference for this evaluation indicate that UNESCO is eager to learn how its RBM training programme compares to those of other agencies. However, the budget and timeframe for this evaluation did not permit a complete comparative study of the UNESCO RBM training programme and a similar activity at another agency. The analysis is therefore brief and only seeks to highlight major elements of comparison, using information readily available on the web.

9.2.2 Data Analysis and Reporting Phase

Information collected from each phase of the evaluation was reduced and summarized by themes associated with the research objectives. The four major themes are:
1. Policy and management of the RBM training
2. Quality and effectiveness of the RBM training
3. Impact of the RBM training
4. Options regarding future RBM training

The analytic tools used include:
- Producing descriptive summaries of findings from the document review, by theme
- Extracting and summarizing themes from notes based on interviews, discussions, and correspondence
- Statistical analysis of the RBM participant survey and post-participation evaluation data

Table 7 provides a summary of the research questions addressed in this evaluation, the corresponding sources of data, and the analytic methods used.

Table 7: Evaluation Research Methods Summary
(An "X" after each question marks the applicable data sources (DR, I&D, ER, PS) and analysis methods (ET, SA); see the codes below the table.)

POLICY & MANAGEMENT
To what organizational needs does the RBM training respond? X X X
What operational policies are in place with regard to training in RBM, covering such details as the selection of staff for training and the number of different categories of staff trained and their sectors? X X X
Have these policies changed over time?
X X X
Do senior management, supervisors, and other staff know about these policies? X X
What are the experiences of senior management, supervisors and other staff in working with these policies? X X
What are the (dis)advantages of these policies? X X
Is there a discrepancy between policy and practice? Why? X X
How is the RBM training programme implemented, in terms of staffing, materials, training schedule, participant selection, etc.? X X X
How much does it cost to provide RBM training? X X X

QUALITY AND EFFECTIVENESS
What is the quality of the RBM training programme in terms of content, materials, and staff? X X X
Do the courses respond to participants' needs? X X X
Are participants satisfied with the training experience? X X X
Are the courses timed to suit participants' schedules? X X X X
Are the courses timed to ensure the greatest impact on the acquisition of RBM skills and their use in major activities such as completing the C/5 and the work plans? X X X
Are course durations adequate to ensure acquisition of RBM skills? X X X X
Are the courses adequately evaluated to ensure that the required quality is being provided and maintained? X X X X
How effective is the RBM training in conveying the key RBM concepts to participants? X X X X
How does UNESCO's RBM training compare with training offered by other agencies? X X X

IMPACT
To what extent have the concepts learned been applied to the participants' work activities? X X X X
To what extent have the concepts learned been applied to the work of the sectors and units, as reflected for instance in work plans and submissions for the EX/4, EX/5, C/3? X X X
To what extent has the training contributed to improving RBM practices in UNESCO? X X X

Data source codes: (DR) Document Review, (I&D) Interviews and discussions, (ER) Evaluation (data) review, (PS) Participant Survey
Analysis methodology codes: (ET) Extracting themes, (SA) Statistical analysis

9.3 Timetable

The timetable for this evaluation was quite short.
The evaluation exercise extended from May 2nd to July 7th 2006, with the bulk of the data collection occurring between May 17th and June 16th. The timetable for work and deliverables was:
- Document review: beginning May 2nd
- Inception report (methodology document): deliverable May 19th
- Key informant interviews and discussions: beginning May 17th
- Fielding of participant survey: June 6th, with a deadline of June 16th
- Analysis: May 29th to June 16th
- Draft report: deliverable June 19th
- Comments received on draft report: November 27th
- Final report: deliverable December 31st

9.4 Limitations of the analysis

The major limitation of this methodology is that it relies on a purely observational design. Without a "control group" of some sort (e.g. a matched comparison group), it is not possible to assert that participation in the RBM training caused any particular outcomes. Moreover, it is difficult to separate out potentially confounding factors that could explain positive or negative outcomes in the area of RBM utilization, such as staff time constraints, problems with the SISTER reporting system, changes made to participants' work plans by individuals who did not participate in an RBM training, etc. In addition, reliance on self-reported skill assessment may somewhat overstate the increase in abilities associated with training.

Second, the methodology relies heavily on in-depth interviews and discussions with key informants. Here, the selection of informants can bias the findings. Efforts were made to ensure that neither overly critical nor overly positive individuals were omitted from the interview schedule.

Third, the evaluation was conducted within a very short time frame. As a result, there was no time for extensive development and testing of the participant survey, and the survey window was relatively short, possibly compromising response rates. Thus, the data collected may be less useful in terms of quantity and quality than would otherwise have been the case.
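The coverage and response-rate figures cited in section 9.2.1 follow directly from the counts reported there. The sketch below (added for illustration only; the variable names are ours, the figures are those reported by the evaluation) restates that arithmetic:

```python
# Illustrative restatement of the survey accounting reported in section 9.2.1.
# All counts are taken from the report; this is not part of the original methodology.

TOTAL_TRAINED = 462  # UNESCO staff trained, after removing duplicates and non-UNESCO attendees
NO_EMAIL = 12        # staff with no email address on file (excluded from fielding)
BOUNCED = 72         # invitations returned as undeliverable (presumed to have left UNESCO)
NO_RECALL = 7        # recipients who did not recall attending the training
RESPONSES = 94       # completed surveys

INVITED = TOTAL_TRAINED - NO_EMAIL              # 450 invitations sent
contacted = INVITED - BOUNCED - NO_RECALL       # 371 presumed reached

# Coverage error: share of the trained population that could not be reached at all
coverage_error = (NO_EMAIL + BOUNCED) / TOTAL_TRAINED

response_rate_contacted = RESPONSES / contacted       # rate quoted as 25%
response_rate_frame = RESPONSES / TOTAL_TRAINED       # rate quoted as 20%

print(f"Invited: {INVITED}, contacted: {contacted}")
print(f"Coverage error: {coverage_error:.0%}")                         # 18%
print(f"Response rate (contacted): {response_rate_contacted:.0%}")     # 25%
print(f"Response rate (original frame): {response_rate_frame:.0%}")    # 20%
```

The two response rates differ only in their denominator: the 25% figure treats only reachable individuals as eligible, while the 20% figure measures responses against everyone trained.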
Despite these limitations, the evaluation is able to describe the current state of the RBM training programme and related policies, to describe how it is being implemented, to assess the quality of training materials, to assess the level of participant satisfaction, to evaluate participants' self-reported use of RBM concepts in their work processes, and to describe the level of stakeholder satisfaction with perceived outcomes.

10 LIST OF PERSONS INTERVIEWED

1. John Parsons, Director, Internal Oversight Service
2. Alaphia Wright, Internal Oversight Service
3. Hans d'Orville, Director, Bureau of Strategic Planning
4. Jean-Yves Le Saux, Deputy Director, Bureau of Strategic Planning
5. Bruno Lefevre, formerly responsible for the RBM Training Unit, Bureau of Strategic Planning (RBM Team)
6. Tobia Fiorilli, Associate Expert, Bureau of Strategic Planning (RBM Team)
7. Maria-Victoria Benavides, Associate Expert, Bureau of Strategic Planning (RBM Team)
8. Anne-Claire Keller, Consultant, Bureau of Strategic Planning (RBM Team)
9. Othilie du Souich, Programme Specialist, Focal point for knowledge management, networking and SISTER, Bureau of Strategic Planning
10. Geneviève Rouchet, Chief, Training and Career Development Section, Bureau of Human Resources Management
11. Markus Voelker, Senior Training Officer, Training and Career Development Section, Bureau of Human Resources Management
12. Yolande Valle, Director of the Bureau of the Budget
13. Rene Zapata, Chief of Executive Office, Social and Human Sciences
14. Mireille Jardin, formerly Chief of Executive Office, Executive Office, Natural Sciences Sector
15. L. Anathea Brooks, Liaison Officer, Coordination and Evaluation, Executive Office, Natural Sciences Sector
16. Julia Hasler, Executive Office, Natural Sciences Sector
17. Min Jeong Kim, Executive Office, Education Sector
18. Paola Leoncini Bartoli, Chief of Executive Office, Executive Office, Culture Sector
19.
Kwame Boafo, formerly Chief of Executive Office, Communication and Information Sector
20. Philippe Bâcle, Président, Le Groupe-conseil baastel ltée
21. Dindin Wahyudin, First Secretary, Indonesian Delegation to UNESCO
22. Dominique Levasseur, Political Advisor, Canadian Delegation to UNESCO
23. Bernardo Aliaga, Programme Specialist, Secretariat of the UNESCO Intergovernmental Oceanographic Commission
24. Georges Poussin, Chief of Section, Division of Arts and Cultural Enterprise

11 GUIDING QUESTIONS FOR SEMI-STRUCTURED INTERVIEWS

The following questions were used to guide the semi-structured interviews and discussions. They provided a broad and flexible structure for each interview; different questions were asked depending on the interviewee, and interviewees were also able to discuss topics not covered by the questions below.

Policy and Management
1. What is your role with respect to the RBM training?
2. In your view, what organizational needs should the RBM training address?
3. How did the RBM training evolve?
   a) When did the RBM training programme begin?
4. What are the goals of the RBM training?
   a) Have the goals changed over time?
5. How does the RBM training fit into the larger corporate learning plan?
6. What corporate learning policies exist with respect to RBM training?
   a) When were the policies for implementing the RBM training developed?
   b) Have they changed over time? Why? How?
   c) How do you communicate the policies throughout UNESCO?
7. Are you familiar with how the RBM training programme is implemented?
8. What do you see as some of the training programme's strengths?
9. What do you see as some of the training programme's weaknesses?
10. [For BSP] How is the RBM training programme implemented?
   a) How are staff made aware of the RBM training opportunity?
   b) How are participants selected for the training?
   c) How is the training schedule determined?
   d) Are individuals able to attend more than one training?
   e) Is the implementation of the RBM training programme codified in any formal policies? If yes: What are they? How have they changed over time? How do you communicate the policies throughout UNESCO? What do you perceive to be the strengths and weaknesses of the current policies? Is there a gap between the policies and the actual practice of RBM training? How great is the gap? What accounts for the gap?
11. How much does the RBM training programme cost to implement?
12. What is your impression of the resources that UNESCO has available for the RBM training programme in terms of staff, money, and time?

Quality and Effectiveness
13. How were the RBM trainings designed?
   a) How often are they revised?
14. How were the RBM trainers selected?
   a) Are they trained?
15. What is your perception of the quality of the training programme?
   a) What information do you use to draw these conclusions?
16. What mechanisms does BSP use to measure and maintain quality?
17. When are the trainings evaluated?
   a) How?
   b) When and how are the evaluation data used?
18. Have you participated in RBM training?
   a) Can you describe your experience?
   b) How did you view the quality of the training?
   c) What criteria are you using to evaluate quality?
   d) Were you familiar with RBM before the training?
   e) Did you feel that your learning needs were assessed before the training?
   f) How effective do you think the training was in imparting the skills you need to use RBM practices at UNESCO?

Impact
19. Do you think the trainings have had a positive impact?
   a) How do you come to this conclusion?
20. Do you think the training has contributed to improving RBM practices in UNESCO?
21. What factors may inhibit or facilitate the success of the RBM training programme in achieving a positive impact on the formulation of work plans?

Recommendations
22. If you could make a few recommendations to improve the RBM training programme, what would they be? Why?
12 INDEVAL RBM TRAINING PARTICIPANT SURVEY

Survey conducted for UNESCO between June 6 and June 16, 2006
- Sampling frame: 462
- Missing emails: 12
- Bounce-backs: 72
- Refusals: 7
- Contacted: 371
- Responses: 94
- Response rate (contacted): 25%
- Response rate (original frame): 20%

12.1 Email Inviting Participation in the Survey

Email subject: Your opinion matters: Help evaluate the BSP/RBM Training Programme

Dear Colleague:

We understand that you participated in a Results-Based Management training course offered by UNESCO's Bureau of Strategic Planning. This training programme is currently being evaluated by a team of external consultants. As a participant in the training programme, your feedback is an extremely valuable part of this evaluation process.

Please take a few minutes to complete the following survey to tell us about your experience with the RBM training programme. We recognize that you are busy, so every effort has been made to keep the survey short. It should take approximately 5 minutes to complete. The survey is anonymous.

The survey deadline is Friday, June 16th 2006. However, we ask that you take the opportunity to complete this survey as soon as possible. Thank you in advance for your participation.

To complete the survey click here: www.surveyurl.com

Respectfully,
Geoffrey Geurts
Evaluation Specialist
The UNESCO Internal Oversight Service

12.2 Email Reminder for Participant Survey

Email subject: REMINDER: Help evaluate the BSP/RBM Training Programme

Dear Colleague:

Your opinion is valuable. As you may recall, the Results-Based Management training course offered by UNESCO's Bureau of Strategic Planning is currently being evaluated by a team of external consultants. As a participant in the training programme, your feedback is an extremely valuable part of this evaluation process. Please take a moment to complete the survey if you have not already done so. The survey must be completed by Friday, June 16th 2006.
We recognize that you are busy, so every effort has been made to keep the survey short. It should take approximately 5 minutes to complete. The survey is anonymous. Thank you in advance for your participation.

To complete the survey click here: www.surveyurl.com

Respectfully,
Geoffrey Geurts
Evaluation Specialist
The UNESCO Internal Oversight Service

12.3 Survey instrument and summary results22

Thank you for taking the time to complete this important survey. We understand that you participated in training offered by UNESCO's Bureau of Strategic Planning on the topic of Results-Based Management (RBM). The purpose of this short survey is to help us learn about your experience so that we may assess the management, quality, and impact of the training programme. Your participation is very valuable and greatly appreciated. Your answers to this survey are anonymous. Thank you in advance for your time and participation.

22 Figures may not add up to 100% due to rounding. Response distributions were calculated excluding missing data (except for questions 13 and 14). In order to protect anonymity and enhance usefulness, qualitative comments have been paraphrased, condensed, summarized, translated, and edited for typographical errors where necessary.

1. How many times did you participate in RBM training? (n=91)
                 Survey   Actual*   Actual^
Once             86%      93%       96%
More than once   14%      7%        4%
*Includes Bangkok regional 1-day session in 2003
^Excludes Bangkok regional 1-day session in 2003

2. In what year did you participate in RBM training? (n=94)
        Survey   Actual*   Actual^
2003    12%      29%       19%
2004    33%      36%       41%
2005    55%      35%       40%
*Includes Bangkok regional 1-day session in 2003
^Excludes Bangkok regional 1-day session in 2003

3. What was the duration of the training programme?
(n=91)
             Survey   Actual*   Actual^
Half day     9%       3%        4%
One day      3%       34%       25%
Two days     15%      5%        5%
Three days   73%      58%       66%
*Includes Bangkok regional 1-day session in 2003
^Excludes Bangkok regional 1-day session in 2003

The results of the introductory questions suggest that while the response rate of the survey was acceptable (25%), the substantial rate of nonresponse caused the survey to over-represent individuals who attended:
1. More than one RBM training (perhaps because they were more likely to be interested in RBM and to recall the training; alternatively, individuals may have confused RBM training with another UNESCO training and thus erroneously reported attending more than once);
2. A training session in 2005 (perhaps because they were more likely to recall the training and to be currently employed at UNESCO);
3. The 3-day sessions, and thus field offices (perhaps because they received greater exposure to RBM and were more likely to recall the material; it is unclear why individuals who attended a one-day partial training/coaching session were far less likely to respond to the survey).

4. How well do you understand the concepts presented in the RBM training? (n=91)
51% Very well
47% Somewhat well
2% Not at all

What is the reason that you are having difficulty understanding RBM concepts? (Applies to those who answered "somewhat well" or "not at all"; multiple answers were possible)
13% The training materials were not clear
4% The trainers were not clear
67% It is difficult to understand how to apply the concepts
38% The training period was too short
4% English-language issues posed a problem for me
18% The exercises were not helpful for me
18% I do not understand all of the jargon

Other (some answers duplicate the categories above):
- Difficulties with concepts:
  a. Difficult to retain for long periods
  b. Difficult to match concepts to UNESCO practice
  c. Terminology wasn't clearly defined
  d. Concepts seemed too theoretical
  e. Concepts are complex
  f.
Confusion between RBM, SISTER, and FABs concepts
- Difficulty understanding how to apply concepts to UNESCO practice:
  a. Concepts seem formulated for the business sector
  b. RBM concepts seem difficult to apply to UNESCO work, which is long-term, when RBM seems to focus on the short term (1-2 years)
  c. It is difficult to know how to develop indicators to measure outcomes that do not easily lend themselves to measurement
  d. The concepts seem useful, but it is difficult to apply them in field offices if they are not applied in headquarters first
- Concerns about the training:
  a. Trainers did not speak Spanish, making comprehension an issue at times
  b. The session was too short
  c. The session seemed improvised
  d. There was insufficient practical application
  e. The in-house trainers were not known to be subject-matter experts

5. Have you developed or contributed to a work plan? (n=90)
89% Yes
11% No

6. Please describe your ability to formulate "expected results" for a work plan. (Applies only to individuals who responded "yes" to question 5)
I find formulating "expected results" to be:
9% Very easy
61% Moderately easy
25% Moderately difficult
4% Very difficult
1% I do not know how to formulate "expected results"

7. Has your ability to formulate "expected results" for a work plan improved as a result of the RBM training? (Applies only to individuals who responded "yes" to question 5)
35% Yes, very much
56% Yes, somewhat
9% No, not at all

8. Please describe your ability to formulate "performance indicators" for a work plan. (Applies only to individuals who responded "yes" to question 5)
I find formulating "performance indicators" to be:
6% Very easy
59% Moderately easy
27% Moderately difficult
5% Very difficult
3% I do not know how to formulate "performance indicators"

9. Has your ability to formulate "performance indicators" for a work plan improved as a result of the RBM training?
(Applies only to individuals who responded "yes" to question 5)
29% Yes, very much
62% Yes, somewhat
9% No, not at all

10. Do you apply the concepts presented in the RBM training to aspects of your work other than developing a work plan? (n=87)
74% Yes
26% No

[For yes] Please provide examples of how you use the information, tools, or concepts you learned in the RBM training (main categories of answers provided):
- Applying a general focus on results rather than process and output in all aspects of my work
- Asking myself, for everything I do, whether it is useful and in what sense it contributes to what I am trying to achieve
- Communication
- The design stage of a project
- Developing individual programme activities in a way that is clear about expected outcomes/results, and more results-oriented than input-oriented
- Developing research papers/projects
- Drafting action plans
- Establishing a clearer link between the SISTER system and the C/5
- Evaluating PP requests
- Examining overall goals and making decisions accordingly
- Exchanges with other agencies and partners
- Setting targets for the office
- Formulation of results for FIT projects
- Fundraising
- Having the National Commissions understand our new tools
- Helping counterparts to formulate projects for UNESCO consideration
- Implementation of activities (monitoring, time management, allocation of responsibilities, resource management, etc.)
- Insisting on concrete deliverables
- Linking activities, actions and MLAs appropriately
- Managing technical task forces and the relevant ToRs, work plans, and products
- Managing staff (analysing staff activities, controlling the planning of staff, organizing team work, human resource management, staff performance assessment)
- Planning and participating in meetings; PowerPoint presentations
- Monitoring and evaluation, including performance indicators and measurement
- Participating in the preparation of common country exercises (e.g.
UNDAF)
- Planning, implementing, and budgeting activities
- Planning activities such as workshops and conferences
- Preparing and reviewing proposals, contracts, terms of reference
- Preparing the C/4 and the C/5
- Project development, implementation and management
- Project preparation for EXB
- Reading documents and programmes from others more critically than before, to see whether the means are proportionate to the goals and objectives
- Report writing in general, mission reports and progress reports
- Reporting in the office
- Reporting on project results and information dissemination
- Strategic planning for programmatic matters
- Trying to think of ways to measure success
- Using SISTER
- Website redevelopment

[For no] What is the primary reason that you do not apply RBM concepts to other work activities?
24% I do not have the opportunity to do so
24% I do not understand the concepts well enough to apply them to my work
10% I do not have the time
24% RBM does not apply to my job*
19% Other
*Some individuals who indicated "other" were re-categorized as "RBM does not apply to my job" based on the qualitative answer provided.
Other:
- It impedes innovation
- RBM seems formulaic
- It has been a long time since the training
- It is necessary for our partners to apply RBM concepts and for us to assist them with this

11. Please indicate if you agree or disagree with the following statements using the response categories below. (n=85)
(Columns: Disagree Strongly | Disagree Somewhat | Agree Somewhat | Agree Strongly | Do Not Know)
- I was knowledgeable about RBM before I completed the training: 18% | 21% | 38% | 18% | 5%
- My learning needs were assessed before I completed the training: 19% | 37% | 24% | 12% | 8%
- After the training, I felt that I had learned information and/or acquired skills that I could use for my work: 4% | 5% | 46% | 46% | 0%
- The date and time of the training were convenient for me: 6% | 8% | 40% | 45% | 1%
- I understand how RBM relates to my work: 5% | 6% | 35% | 52% | 2%

12.
Please indicate your satisfaction with the duration of the RBM training course in terms of your ability to learn the concepts and develop RBM skills. (n=83)
48% It was just right
46% It was too short
6% It was too long

13. Which category best describes you? (n=94)
63% Professional staff (grades P, D, ADG, DGG)
17% General staff (grades G, L)
20% Other/no response

14. What sector do you work in? (n=94)
7% Culture
4% Communication and Information
32% Education
22% Natural Sciences
4% Social and Human Sciences
6% Multiple response
23% None of the above/no response

15. If you would like to comment on topics covered by this survey, please feel free to do so below:

On satisfaction with the training:
- The RBM workshop was excellent
- The RBM trainers were excellent communicators
- Participation in the training contributed considerably to staff professional development
- The training was extremely useful and provided a lot of insight regarding the planning and monitoring of work plans
- Individual assistance provided after the training was valued and helpful
- The RBM training contributed to better writing of proposals and reports
- The training was very professional, with the sense of humor needed to address a complex topic
- The training was practical and relevant; it helped office staff to review and revise the Cluster Strategic Priorities and Actions 2003-2007

On the need for practical examples:
- The training needs to be more closely related to the actual work being done
- Exercises and examples, especially related to SISTER, should be given
- Theory and practice shouldn't be separated; it is important to understand how to customize RBM theory to UNESCO work practice
- By the time RBM is used for the first time in the preparation of a work plan, individuals can forget some of what was learned in the workshop if there is insufficient opportunity to practice applying the concepts
On perceptions of organizational constraints:
- RBM is not fully applied at UNESCO
- Senior staff need to value and apply RBM concepts themselves
- UNESCO needs simplicity
- There is a need for more staff in the field and fewer staff at headquarters to reduce workloads; staff would then have enough time to focus on applying RBM well
- Working in an office in an unpredictable and emergency situation makes it difficult to plan programmes in advance in the RBM format
- There seems to be an inconsistency between RBM and the way work planning occurs, particularly with respect to the role of the budget in programme planning
- Wide-ranging, more abstract programmes are more likely to encounter difficulties in formulating expected results and performance indicators

On the timing and duration of the trainings:
- Conducting training in August (when all specialists are on leave) means that not everyone benefits
- Training sessions should be targeted around planning periods (e.g. the beginning of a new biennium)
- Three days is too short to grasp all the concepts; the training could be extended to five days
- The training was intense during the three-day period and should be broken up
- Three days of RBM training competes with other work responsibilities
- Two days of intensive training are enough

On dissatisfaction with the training:
- Next time, choose better consultants
- There was a mix of discussion on RBM and SISTER, combined with practical training on activities already planned for the next biennium, which was confusing
- It is difficult to see the benefits (except bureaucratic ones and formulaic reporting) of RBM for the goals of UNESCO in science
- RBM training was of no use whatsoever
- The trainer seemed to give more attention to the director of the office and to people who spoke French than to people who spoke English

On the desire for more training and additional support:
- There is a desire for additional training
- Regular follow-up to the training should take place, with refresher sessions occurring in groups or even online
- Additional training on indicators and monitoring would be useful
- An obligatory e-learning course on RBM would be useful, particularly to train new consultants who are unaware of RBM
- RBM training should be a regular exercise

Other suggestions:
- Move from supply-driven to demand-driven training, and from theoretical to practical content
- Train groups of people working together on the same project
- Plan the training further in advance
- Assess training needs in advance
- Devote more time to the development of performance indicators, especially those related to outcomes and processes that do not easily lend themselves to measurement
- Provide practical trainings in which the National Commissions can participate

13 RBM TRAINING SCHEDULE 2003-2005

Table 8: RBM Training Schedule, 2003-2005 (number of enrolled participants in parentheses)
Addis Ababa regional (7) 3; Teacher training coaching (9)
January: Nairobi (12); CLT coaching (13) 3 4
February: Primary Education (16) 1; IOC Division (13)
March: Cairo office (17)
April: Kingston office (14) 1
May: Amman office (18); Bureau of the Budget (6); Beirut office (20)
June: Ramallah (15) 1; Jakarta office (13)
July: Windhoek office (17); Jakarta office (12)
August: Dakar office (15) 1; Bangkok regional (72) 1; Bangkok office (13); Bangkok office (13)
September: Dakar office (19) 1; Bangkok office (19); Doha office (8); Dakar sub-regional (7)
October: Beijing office (21) 2; Quito office (19); IOC workshop/coaching (22)
November: Quito regional (12); Science sector coaching (43) 3; Montevideo office (19); Bonn UNEVOC (10)
December: -

Key: 1 Partial training, one day; 2 Coaching sessions, two days; 3 Coaching sessions, one day; 4 Partial training, half day

TOTAL ENROLLED PARTICIPANTS: 513
Non-UNESCO participants:
- Ramallah, Palestinian Authority: 15
- Jordanian National Commission: 2
- UNRWA: 1
- National Commission (Lebanon): 2
TOTAL UNESCO
STAFF: 493 (includes duplicates)

14 GENERAL BIBLIOGRAPHY

"Extract from the Decisions Adopted by the Executive Board at its 166th Session (166 EX/Decisions) (Paris, 4-16 April 2003)." In Draft Programme and Budget for 2004-2005 (32 C/5 and Corr.).
RBM in UNDP: Overview and General Principles (Introduction to Results-Based Management), New York: United Nations Development Programme
RBM in UNDP: Selecting Indicators (Signposts of Development), New York: United Nations Development Programme
RBM in UNDP: Technical Note (Knowing the What and the How), New York: United Nations Development Programme
UNFPA Policy Statement on RBM, New York: United Nations Population Fund
Speech of the Director-General at the Closing Ceremony of the 30th Session of UNESCO's General Conference, Paris: UNESCO, November 17, 1999
Results-Based Management at UNFPA, New York: United Nations Population Fund, June 2000
Review of Management and Administration in the United Nations Educational, Scientific and Cultural Organization (JIU/REP/2000/4), 160 EX/41. Paris: UNESCO, August 18, 2000
Strengthening Country Office Operations in Managing for Results, New York: United Nations Population Fund, February 9, 2000
Results-Based Management Orientation Guide, New York: United Nations Population Fund, June 2001
Medium-Term Strategy 2002-2007: Contributing to Peace and Human Development in an Era of Globalization Through Education, the Sciences, Culture and Communication, 31 C/4 Approved. Paris: UNESCO, February 2002
Report of the Director-General on the Activities of the Organization in 2000-2001, Communicated to Member States and the Executive Board in Accordance with Article VI.3.b of the Constitution, 32 C/3. Paris: UNESCO, 2002
Decisions Adopted by the Executive Board at its 166th Session, 166 EX/Decisions.
Paris: UNESCO, May 14, 2003
From the Director-General to DDG, All ADGs, Directors of Central Services and UNESCO Institutes, Directors of Field Offices on the Preparation of Workplans for 32 C/5, Memorandum. July 7, 2003
Report by the Director-General on the Reform Process, 32 C/32. Paris: UNESCO, September 24, 2003
Analytical Review of the Execution of the Programme and Main Results Achieved During the First 18 Months of the 2004-2005 Biennium, 33 C/INF.3. Paris: UNESCO, September 29, 2005
Information Concerning the Implementation of the Programme and Budget for 2004-2005, 33 C/INF.3. Paris: UNESCO, September 29, 2005
Report by the Director-General on the Activities of the Organization in 2002-2003, Communicated to Member States and the Executive Board in Accordance with Article VI.3.b of the Constitution, 33 C/3. Paris: UNESCO, 2005
Report by the Director-General on the Progress Made in the Implementation of the Recommendations of the External Auditor on Audits Already Undertaken, 33 C/INF.9. Paris: UNESCO, October 4, 2005
Report by the External Auditor on the Performance Audits Undertaken in the 2002-2003 Biennium, 33 C/INF.8. Paris: UNESCO, October 4, 2005
Reports of the Joint Inspection Unit (JIU) of Interest to UNESCO and the Status of Implementation of Recommendations of Previous Reports and on Results Obtained, 171 EX/38. Paris: UNESCO, February 28, 2005
Approved Programme and Budget 2006-2007, 33 C/5 Approved. Paris: UNESCO, 2006
Biennial Evaluation Report on the Activities and Results of all UNESCO Decentralized Bodies, 174 EX/20. Paris: UNESCO, February 23, 2006
Joint Report by the Director-General on the Implementation of the Programme and Budget (32 C/5) and on Results Achieved in the Previous Biennium 2004-2005 (Draft 34 C/3), 174 EX/4 - Draft 34 C/3. Paris: UNESCO, March 17, 2006
Report by the Director-General on the Reform Process, Part I, Staff Policy, 174 EX/6.
Paris: UNESCO, February 10, 2006
Reports by the Joint Inspection Unit (JIU) of Interest to UNESCO and the Status of Implementation of Approved/Accepted Recommendations of Joint Inspection Unit Reports, 174 EX/33. Paris: UNESCO, February 10, 2006
Cooley, Lawrence, Joan Goodin, and Peter Bracegirdle. Report on the Institutionalization of Results-Based Management at UNFPA, New York: United Nations Population Fund, May 31, 2000
Dufresne-Klaus, D. "Performance Assessment Policy." Administrative Circular 2205 (2004).
Meier, Werner. Results-Based Management: Towards a Common Understanding Among Development Cooperation Agencies, Discussion Paper (Ver. 5.0). Ottawa: Results-Based Management Group, October 15, 2003
Moller, Birgitte. UNF/UNFIP Project Document, WHC/AO/2005/64. Project document. Paris: UNESCO, July 25, 2005
Ortiz, Even Fontaine, Ion Gorita, Sumihiro Kuyama, Wolfgang Münch, Guangting Tang, and Victor Vislykh. Overview of the Series of Reports on Managing for Results in the United Nations System, JIU/REP/2004/5. Geneva: Joint Inspection Unit, 2004
Ortiz, Even Fontaine, Ion Gorita, and Victor Vislykh. Delegation of Authority and Accountability, Part II, Series on Managing for Results in the United Nations System, JIU/REP/2004/7. Geneva: UN Joint Inspection Unit, 2004
Ortiz, Even Fontaine, Ion Gorita, and Victor Vislykh. Managing Performance and Contracts, Part III, Series on Managing for Results in the United Nations System, JIU/REP/2004/8. Geneva: UN Joint Inspection Unit, 2004
Ortiz, Even Fontaine, Sumihiro Kuyama, Wolfgang Münch, and Guangting Tang. Implementation of Results-Based Management in the United Nations Organizations, Part I, Series on Managing for Results in the United Nations System, JIU/REP/2004/6. Geneva: UN Joint Inspection Unit, 2004
RTC Consultants and UNESCO Bureau of Strategic Planning. Results Based Programming, Management, and Monitoring (RBM) Guide, Paris: UNESCO
UNESCO Bureau of Strategic Planning.
"Expectations of the Windhoek RBM training workshop." Documentation provided by the Bureau of Strategic Planning.
UNESCO Bureau of Strategic Planning. Letter of invitation to participants for the Jakarta training workshop, July 7, 2005
UNESCO Bureau of Strategic Planning. Invitation letter to participants for the Windhoek training workshop, June 24, 2004
UNESCO Bureau of Strategic Planning. RBM Training Programme: Focusing on Results, Windhoek UNESCO Office, Agenda. July 6-8, 2004
UNESCO Bureau of Strategic Planning. RBM Training Programme: Focusing on Results, Jakarta UNESCO Office, Agenda. July 19-21, 2005
UNESCO Bureau of Strategic Planning. Results Based Management Training Workshop - Windhoek, 6-8 July 2004. Workshop file
UNESCO Bureau of Strategic Planning. Results Based Management Training Workshop - Bonn, 16-18 November 2005. Workshop file
UNESCO Bureau of Strategic Planning. RBM working document for the preparation of 33 C/5 Activities, Training material
UNESCO Bureau of Strategic Planning. Requests by BSP for 2004 Training Funded from Corporate HRM Funds, Submission to the Learning and Development Commission. October 24, 2003
UNESCO Bureau of Strategic Planning. SISTER Monitoring Qualitative Report, Formations pour la gestion fondée sur les résultats (activity)
UNESCO Bureau of Strategic Planning. SISTER Monitoring Qualitative Report, Results Based Management (RBM) Training (Office 5)
UNESCO Bureau of Strategic Planning. Report: Implementation of Allocated Training Funds 2003, Excel spreadsheet
UNESCO Bureau of Strategic Planning. 39321101 (TCD/CCA-UNDAF), Work plan. 2006
UNESCO Bureau of Strategic Planning. 39323201 (TCD/RBM), Work plan. 2006
UNESCO Bureau of Strategic Planning. RBM and Common Country Programming Training, 37641202 (BSP/Training).
UNESCO, Learning and Development Commission. Meeting Minutes, July 13, 2005
UNESCO, Learning and Development Commission. Meeting Minutes, One-Day Retreat, May 23, 2005
15 BIBLIOGRAPHY FOR THE RBM SECTION

15.1 Internal reports and documents

Dufresne-Klaus, D. Performance Assessment Policy. Administrative Circular 2205 (2004).
Reports of the Joint Inspection Unit (JIU) of Interest to UNESCO and the Status of Implementation of Recommendations of Previous Reports and on Results Obtained, 171 EX/38. Paris: UNESCO, February 28, 2005
Meier, Werner. Results-Based Management: Towards a Common Understanding Among Development Cooperation Agencies, Discussion Paper (Ver. 5.0). Ottawa: Results-Based Management Group, October 15, 2003
Moller, Birgitte. UNF/UNFIP Project Document, WHC/AO/2005/64. Project document. Paris: UNESCO, July 25, 2005
Ortiz, Even Fontaine, Ion Gorita, Sumihiro Kuyama, Wolfgang Münch, Guangting Tang, and Victor Vislykh. Overview of the Series of Reports on Managing for Results in the United Nations System, JIU/REP/2004/5. Geneva: Joint Inspection Unit, 2004
Ortiz, Even Fontaine, Sumihiro Kuyama, Wolfgang Münch, and Guangting Tang. Implementation of Results-Based Management in the United Nations Organizations, Part I, Series on Managing for Results in the United Nations System, JIU/REP/2004/6. Geneva: UN Joint Inspection Unit, 2004
Ortiz, Even Fontaine, Ion Gorita, and Victor Vislykh. Delegation of Authority and Accountability, Part II, Series on Managing for Results in the United Nations System, JIU/REP/2004/7. Geneva: UN Joint Inspection Unit, 2004
Ortiz, Even Fontaine, Ion Gorita, and Victor Vislykh. Managing Performance and Contracts, Part III, Series on Managing for Results in the United Nations System, JIU/REP/2004/8. Geneva: UN Joint Inspection Unit, 2004
RTC Consultants and UNESCO Bureau of Strategic Planning. Results Based Programming, Management, and Monitoring (RBM) Guide, Paris: UNESCO
UNDP, RBM in UNDP: Technical Note (Knowing the What and the How), New York: United Nations Development Programme
UNDP, Internal Review of the UNDP Bureau for Crisis Prevention and Recovery, by Trevor Gordon-Somers, Dr.
Elizabeth Scheper, Thierry Senechal and Dr. Charles Alao, Report dated 4 October 2005.
UNDP, Introduction to Results-Based Management, Overview and General Principles, Technical Note obtained during the UNDP BCPR review conducted by Thierry Senechal in 2005.
UNDP, RBM in UNDP: Selecting Indicators (Signposts of Development), New York: United Nations Development Programme
UNESCO, On the Training in Results-Based Programming, Budgeting, Management and Monitoring (RBM) - Focusing on Results in the Preparation of Work Plans. UNESCO, 2003.
UNFPA, Strengthening Country Office Operations in Managing for Results, New York: United Nations Population Fund, February 9, 2000.
UNFPA, Results-Based Management Orientation Guide, New York: United Nations Population Fund, June 2001.
UNFPA, Policy Statement on RBM, New York: United Nations Population Fund
UNFPA, Results-Based Management at UNFPA, New York: United Nations Population Fund, June 2000

15.2 Internet links consulted

15.2.1 UNDP (http://www.undp.org/eo/methodologies.htm)
"Signposts of Development - RBM in UNDP: Selecting Indicators"
http://www.undp.org/eo/documents/methodology/rbm/Indicators-Paperl.doc
"Introduction to Results-Based Management - RBM in UNDP: Overview and General Principles"
http://www.undp.org/eo/documents/methodology/rbm/RBM-Overview-GP.doc
"Knowing the What and the How - RBM in UNDP: Technical Note"
http://www.undp.org/eo/documents/methodology/rbm/RBM-technical-note.doc

15.2.2 UNFPA (http://www.unfpa.org/results/index.htm)

"Report on the Institutionalization of Results-Based Management at UNFPA" by Lawrence Cooley, Team Leader, MSI; Joan Goodin, MSI; and Peter Bracegirdle, Appian. May 31, 2000
http://www.unfpa.org/results/docs/finalrpt-inst.doc
"UNFPA Policy Statement on Results-Based Management"
http://www.unfpa.org/results/index.htm
"Results-Based Management Orientation Guide." United Nations Population Fund (UNFPA), Strategic Planning and Coordination Division (SPCD), Office for Results-Based Management (ORM). June 2001
http://www.unfpa.org/results/index.htm
"Introduction to Results-Based Management at UNFPA." United Nations Population Fund (UNFPA), Strategic Planning and Coordination Division (SPCD), Office for Results-Based Management (ORM). June 2000
http://www.unfpa.org/results/index.htm
"Strengthening Country Office Operations in Managing for Results" (ORM input to Action Coordination Team (ACT) III: Strengthening CO operations).
9 February 2000
http://www.unfpa.org/results/index.htm

15.2.3 UNICEF

"Understanding Results Based Programme Planning and Management: Tools to Reinforce Good Programming Practice," UNICEF Evaluation Office and Division of Policy and Planning, September 2003
www.unicef.org/evaluation/files/RBM_Guide_20September2003.pdf

15.2.4 OECD

"Results-Based Management in the Development Co-operation Agencies: A Review of Experience," Background Report 2001, OECD
http://www.oecd.org/dataoecd/17/1/1886527.pdf

15.2.5 World Bank

2006 Annual Report on Operations Evaluation, Independent Evaluation Group (IEG), The World Bank
http://siteresources.worldbank.org/EXTANNREPOPEEVA/Resources/AROE06-full.pdf

16 END NOTES FOR THE RBM SECTION

i Comprehensive Review of Governance and Oversight within the United Nations, Funds, Programmes and Specialized Agencies. Volume III, Governance - Current UN Practices, Gap Analysis
ii For further details on the organizations and agencies that have adopted the RBM approach, see Ortiz, Even Fontaine, et al. "Overview of the Series of Reports on Managing for Results in the United Nations System." Geneva: Joint Inspection Unit, 2004. JIU/REP/2004/5.
iii Note on RBM, Operations Evaluation Department, World Bank, 1997.
iv "Results-based Management in Canadian International Development Agency," CIDA, January 1999.
v See UNDP, RBM in UNDP: Selecting Indicators (Signposts of Development), New York: United Nations Development Programme, p. 20.
vi See the UNFPA Policy Statement on RBM at http://www.unfpa.org/results/docs/policy.doc
vii The author conducted an extensive review of the UNDP BCPR performance framework in 2005. The data and analysis were done at that time.
viii This is described at greater length in UNDP, Introduction to Results-Based Management, Overview and General Principles, Technical Note obtained during the UNDP BCPR review conducted by Thierry Senechal in 2005, pp. 1-8.
ix See the report entitled Internal Review of the UNDP Bureau for Crisis Prevention and Recovery, by Trevor Gordon-Somers, Dr. Elizabeth Scheper, Thierry Senechal and Dr. Charles Alao, dated 4 October 2005, pp. 20-33.
x Introduction to Results-Based Management, Overview and General Principles, Technical Note obtained during the UNDP BCPR review conducted by Thierry Senechal in 2005, p. 7.
xi See also the document entitled "Cost-efficient approaches to providing programme-level data," DP/2006/CRP.2, 22 December 2005.
xii See the report entitled Internal Review of the UNDP Bureau for Crisis Prevention and Recovery, by Trevor Gordon-Somers, Dr. Elizabeth Scheper, Thierry Senechal and Dr. Charles Alao, dated 4 October 2005, pp. 20-33.
xiii See the report entitled 2006 Annual Report on Operations Evaluation, Independent Evaluation Group (IEG), The World Bank. The IEG is an independent, three-part unit within the World Bank Group: IEG-World Bank is charged with evaluating the activities of the IBRD (the World Bank) and IDA, IEG-IFC focuses on assessing IFC's work toward private sector development, and IEG-MIGA evaluates the contributions of MIGA guarantee projects and services.
xiv 2006 Annual Report on Operations Evaluation, Independent Evaluation Group (IEG), The World Bank, p. 11.
xv 2006 Annual Report on Operations Evaluation, Independent Evaluation Group (IEG), The World Bank, p. 98.
xvi See the report entitled Internal Review of the UNDP Bureau for Crisis Prevention and Recovery, by Trevor Gordon-Somers, Dr. Elizabeth Scheper, Thierry Senechal and Dr. Charles Alao, dated 4 October 2005, p. 31.