
Framework for the Evaluation of the Structural Funds in Ireland



David Hegarty, NDP/CSF Evaluation Unit, Ireland

Paper prepared for Fifth European Conference on the Evaluation of the Structural Funds, Budapest, 26-27 June 2003

Contact details: David Hegarty, Senior Evaluator, NDP/CSF Evaluation Unit, Frederick Buildings, South Frederick Street, Dublin 2, Ireland Tel: +353-1-6045333 Fax: +353-1-6045334 Email:

1. Introduction
This paper charts the development, over the last decade or so, of the framework now utilised for the evaluation of structural fund programmes in Ireland. Following this introduction, a background section summarises some key features of the three Community Support Framework (CSF) programmes for Ireland covering the periods 1989 to 1993, 1994 to 1999 and 2000 to 2006. The next section examines developments in evaluation structures over the period, including the organisation of the evaluation function and the role of evaluation in the management of structural fund programmes. This complements the following section, which traces developments in the evaluation approach or focus of evaluation work. A noteworthy aspect of this approach has been a strong and increasing emphasis on the critical issue of the rationale for different types of public investment. The final section examines how evaluation has influenced the formulation of investment strategies and the allocation of structural fund resources.

2. Background
Ireland has been a recipient of significant structural fund transfers since the late 1980s. During both the 1989 to 1993 and 1994 to 1999 periods, Ireland was treated as an Objective 1 region. Total structural funds expenditure in Ireland amounted to €4.2 billion under the CSF 1989 to 1993 and €5.8 billion under the CSF for 1994 to 1999, equivalent to about 1.7 per cent of average GDP over the entire period. In the current 2000 to 2006 programming period, Ireland has been designated as two NUTS II regions: the Border, Midland and Western (BMW) region (which enjoys Objective 1 status to 2006); and the Southern and Eastern (S&E) region (which qualifies for transitional Objective 1 funding to 2005). Including the 4 per cent performance reserve, the total structural funds allocation to Ireland under the CSF 2000 to 2006 amounts to €3.2 billion (equivalent, on an annual basis, to about 0.4 per cent of 2001 GDP).

Reflecting its Objective 1 status at the time, the main focus of the first two CSF programmes was on raising the long-term growth potential of the Irish economy and promoting economic convergence with the European Union. A secondary objective was to tackle the then serious labour market imbalances in the economy, reflected in a significant investment of EU resources (via the ESF) in the development of human resources. (The unemployment rate in Ireland averaged 14 per cent over the first half of the 1990s, with long-term unemployment accounting for 59 per cent of total unemployment). The spectacular performance of the Irish economy in the latter half of the 1990s has been such that these objectives have largely been attained. Over the period of the second CSF (1994 to 1999), the Irish economy, as measured by GDP, grew at an average rate of 9 per cent per annum compared with a CSF target of 4 per cent. Unemployment fell from just under 15 per cent of the labour force in 1994 to under 6 per cent by 1999.


Figure 1: GDP per capita (PPS terms) in Ireland, EU-15=100 (Source: EUROSTAT)

Reflecting these developments, the emphasis of the current NDP and CSF is somewhat different. The largely unanticipated pace of economic growth in the late 1990s has exacerbated significant shortcomings in the stock and quality of important economic and social infrastructure in areas such as transport, housing, environmental services and health. Concerns have also risen around the pattern of economic growth, both in terms of its spatial distribution and the persistence of social exclusion problems amongst certain groups. Accordingly, the key objectives of the current NDP/CSF are to address the economy’s infrastructure deficit and to promote more balanced regional development and social inclusion.
Figure 2: Unemployment rate in Ireland (Source: CSO)

3. Evaluation Structures: Evolution and Development
3.1 Introduction
Ireland makes for an interesting case study in the development of structural funds evaluation systems. For one thing, there was little prior tradition of formal evaluation of public expenditure programmes in the Irish public administration. Secondly, Ireland now has extensive experience of the evaluation of structural fund programmes across three successive programming rounds. There now follows a brief history of how evaluation systems and arrangements have developed over the period from the early 1990s.

3.2 CSF 1989 to 1993 and 1994 to 1999
The period of the first CSF (1989 to 1993) saw the introduction of a number of elements of evaluation. A significant development was the establishment (in 1992 and 1993 respectively) of the ESF Programme Evaluation Unit and the Industry Evaluation Unit. These units were established on an independent basis under the aegis of the government departments responsible for management of the operational programmes for human resource development and industrial development respectively. (At a later stage in 1995, a similar unit was established under the aegis of the operational programme for agriculture and rural development.) These units carried out a significant volume of interim measure-level and thematic evaluation work over the period from their establishment to 1999.

The 1994 to 1999 CSF saw a major expansion in evaluation output. At the outset, the CSF and the component operational programmes were the subject of ex-ante evaluations or prior appraisals. While these evaluations were commissioned at a rather late stage in the process of programme negotiation and had little immediate impact, they laid the basis for an extensive process of subsequent ongoing and mid-term evaluation. Over the period of the CSF, most of the operational programmes had an ongoing evaluator presence in the form of either an independent internal evaluation unit or an external evaluator. In 1996, the CSF Evaluation Unit (now NDP/CSF Evaluation Unit) was established to co-ordinate the evaluation effort across the CSF and to promote good practice in evaluation methodology. A considerable volume of evaluation work was carried out in this period; in its 1998 review of the ongoing evaluation function in the CSF, the CSF Evaluation Unit identified some 115 separate interim evaluation outputs across nine operational programmes over the period 1995 to 1997.
It also noted that, taking into account the cost of the mid-term evaluation process, evaluation costs to end 1997 were running at some 1.2 per cent of CSF expenditure to that point. Although not a regulatory requirement at the time, a comprehensive mid-term evaluation and review of the CSF was undertaken in 1996 and 1997, with evaluations commissioned at both OP and CSF level. A co-ordinated approach was taken to the process, with OP evaluations being conducted to common terms of reference and acting as inputs to a subsequent overall CSF-level evaluation.1 This enabled the CSF evaluation to concentrate on cross-programme, systemic issues. The mid-term evaluation of the CSF, undertaken by the Economic and Social Research Institute, is widely regarded – both in Ireland and in the EU – as an example of best practice in the evaluation of large, diverse programmes. The overall approach and key methodologies utilised in the evaluation are discussed later.


1 See the paper presented by David Hegarty and Patrick Honohan, Mid-Term Evaluation of CSF for Ireland, European Conference on Evaluation Practices in the Field of Structural Policies, Seville, March 1998


3.3 NDP/CSF 2000 to 2006
There have been a number of developments in the evaluation process in the current programming period. The current set of structures has been informed by lessons learnt from the 1994 to 1999 round when, as noted above, a significant volume of evaluation work was carried out through both internal and external evaluators. As mentioned earlier, the CSF for Ireland 2000 to 2006 provides for a total structural funds investment of some €3.2 billion, co-financing a total CSF public investment of just under €5.5 billion. The CSF forms part of the wider National Development Plan (NDP), accounting for total public (including EU) investment of some €51 billion. Thus, the NDP includes significant components that are not co-financed by the structural funds or where the overall structural funds aid rate is quite low. However, the Irish authorities decided to apply the structural funds monitoring and evaluation arrangements to all NDP investment, regardless of funding source. Thus, for monitoring and evaluation purposes, the NDP and CSF are implemented as integrated programmes. In this respect, the current programming period marks a significant extension of the CSF evaluation regime, such that the NDP/CSF evaluation process covers about one fifth of annual total public expenditure in Ireland. In the sub-sections below, the organisation of the ex-ante, interim and mid-term evaluation processes is reviewed.

3.3.1 Ex-ante evaluation
This integrated approach to evaluation can be traced back to the ex-ante evaluation process, which was organised in two stages.2 In the first phase, the Department of Finance invited tenders for an assessment of national investment priorities for the 2000 to 2006 period, on the basis of an agreed set of broad government objectives for the period.
This tender was issued at an early stage (March 1998) in the preparation of the NDP and coincided with the invitation of submissions from the various regional, social partner and sectoral interests. The overall purpose of the exercise was to provide the Government with an independent, expert analysis of the investment needs of the economy. As part of the assignment, the evaluators (Economic and Social Research Institute) undertook a critique of the submissions received from the various interest groups. The methodological approach employed in the report – National Investment Priorities for the Period 2000-2006 (published by the ESRI in March 1999) – is summarised in the next section.

The second stage in the evaluation process commenced when the Plan was nearing completion. To satisfy the requirements of Article 41 of the regulations on ex-ante evaluation, the CSF Evaluation Unit (now NDP/CSF Evaluation Unit) was asked to prepare a formal ex-ante evaluation of the Plan (as submitted to the Commission). This report – Ex Ante Evaluation of the National Development Plan, 2000-2006 – was, to some extent, a follow-up to the earlier ESRI analysis. The focus and approach of this evaluation are reviewed below in section 4.2.2.

2 For more details see the paper Ex Ante Evaluation Process for the 2000-2006 Period in Ireland submitted by David Hegarty and John Fitz Gerald (ESRI) to the Fourth European Conference on Evaluation of the Structural Funds, Edinburgh, September 2000


3.3.2 Ongoing (Interim) Evaluation
As noted above, the practice of carrying out interim evaluation – i.e., evaluations between the ex-ante and mid-term stages and subsequently between the mid-term and ex-post stages – was a significant feature of the evaluation regime in the 1994 to 1999 period. This has been carried through to the current period, although with some changes in the organisation of the process. The interim evaluation process is now organised at an overall NDP/CSF level by the NDP/CSF Evaluation Unit, rather than on an individual operational programme basis (i.e., with separate OP level evaluation units and/or external evaluators) as was the case under the 1994 to 1999 CSF. In the early stages of the programming period (up to around end 2000, when the programming complement documents were submitted to the Commission), the work of the Evaluation Unit was concentrated on the development of indicators. Based on a guidance document produced at an earlier stage, the Unit provided hands-on assistance and advice to programme managers on the selection and quantification of indicators.3

The next step in the process was the preparation by the Evaluation Unit of an interim evaluation work programme for the period 2001 to 2003 (i.e., for the period to the mid-term evaluation). This work programme, comprising some 16 proposed evaluation projects across the NDP/CSF, was adopted by the NDP/CSF Technical Assistance Monitoring Committee in May 2001. (This committee is chaired by the managing authority of the NDP/CSF and includes representatives of the operational programme managing authorities and, in an advisory capacity, the Commission.) The work programme aimed to produce evaluation outputs which, as well as acting as inputs to the mid-term evaluation, would represent timely and useful evaluation outputs in their own right. A number of projects were selected on the basis of concerns or issues raised in the ex-ante evaluation.
In this way, the work programme was designed as a bridge between the ex-ante and mid-term evaluations. The work programme was subject to a number of revisions to take account of delays in programme start-up (which rendered some proposed projects redundant) and in the recruitment of evaluator staff to the NDP/CSF Evaluation Unit. In the event, some seven such evaluations have been carried out over the period 2001 to 2003; five of these were commissioned externally, with the other two being carried out internally by the NDP/CSF Evaluation Unit. The main focus of these evaluations has been on programme management and implementation aspects, including issues such as project selection and appraisal, project management, indicators, targeting of measures and potential problems of overlap and duplication. In line with the detailed procedures set out in the CSF, completed evaluation reports have been formally considered by the relevant operational programme monitoring committees. These procedures require that the relevant managing authority must, following any necessary consultation with implementing bodies, submit a formal response on each evaluation recommendation to the monitoring committee.

3 CSF Evaluation Unit, CSF Performance Indicators: Proposals for 2000–2006 Programming Period, October 1999 (available at


Subsequently, the managing authority is obliged to report back to the monitoring committee at regular intervals on progress made in implementing agreed recommendations. These procedures represent an important innovation in the evaluation system compared with the previous CSF, when evaluation reports often received little or no formal consideration by monitoring committees.

3.3.3 Mid-Term Evaluation
The mid-term evaluation process is currently in progress, with evaluations underway at both NDP/CSF and operational programme level. Drawing on the experience of the 1994 to 1999 round, the process has again been organised in a highly co-ordinated manner. A decision was taken in mid-2002 to establish a mid-term evaluation planning group at NDP/CSF level to co-ordinate the process. This group brings together representatives of the OP managing authorities and the relevant Commission directorates under the chairmanship of the NDP/CSF managing authority. The NDP/CSF Evaluation Unit acts as secretariat to this group; the group also functions as the steering committee for the mid-term evaluation of the NDP/CSF. In terms of its co-ordination function, the committee agreed an overall timetable for the evaluation process and core terms of reference for the OP level evaluations. In practice, these core terms of reference have required little more than minor adaptation to the particular circumstances at operational programme level, resulting in an essentially common evaluation approach at the level of the three national and two regional operational programmes. The terms of reference for the NDP/CSF mid-term evaluation complement those at OP level, having been drafted by the NDP/CSF Evaluation Unit and approved by the mid-term evaluation planning group (steering committee).

4. Evaluation Approach and Focus
This section complements the previous one on evaluation structures. It traces the development and application of the approach to CSF-level evaluation in Ireland. A key feature of this has been an increasingly sophisticated approach to the analysis of the rationale, or market failure justification, for different types of investments under the CSF. The rationale issue has now been formally endorsed as one of five key evaluation questions in an agreed evaluation approach set out in the current CSF.

4.1 CSF 1989 to 1993 and 1994 to 1999

4.1.1 Ex-post evaluation of CSF 1989 to 1993/ex-ante evaluation of CSF 1994 to 1999
The first major, CSF-level evaluation of structural fund programmes in Ireland was that prepared by the Economic and Social Research Institute in 1993.4 The report, commissioned by the Department of Finance, served both as an ex-post evaluation of the 1989 to 1993 CSF and an (ex-ante) analysis of investment priorities for the 1994 to 1999

4 ESRI, The Community Support Framework – Evaluation and Recommendations for the 1994-1997 Framework, Final Report to the Department of Finance, April 1993


CSF.5 The evaluation employed a mix of macroeconomic and microeconomic methodologies; this combination was to become a feature of subsequent CSF-level evaluation work in Ireland. Using a medium-term model of the Irish economy, the report estimated the macroeconomic effects of the 1989 to 1993 CSF on the Irish economy. However, the authors cautioned against reading too much into these results, stating that “This analysis is based on a series of vital assumptions about the likely rate of return from different types of investments. While our analysis brought together a range of evidence on this issue, the evidence remains weak and patchy. Thus the results of the analysis must be viewed with considerable caution.” Subsequent research and evaluation work have since yielded more robust estimates, enhancing the reliability of the model results.

The approach used to identify priority areas for investment in the 1994 to 1999 CSF was more microeconomic in nature. The authors used a range of criteria to inform their recommendations in this regard. A strong emphasis was put on the prospective rate of return from different types of investments. The report recognised that the returns from many projects might be private in nature, and that “even where the private rate of return from a project is high, CSF funds should only be used to fund it where there is a clearly established case of market failure which will prevent the beneficiaries from funding it themselves” (emphasis added). As will be discussed later, future evaluation work was to develop a more sophisticated methodology around the concept of market failure. A further point highlighted in the evaluation and developed in subsequent evaluation work was the need to recognise that the opportunity cost of EU funds was identical to that of national exchequer funds and that this represented an appropriate benchmark against which to measure the rate of return.
4.1.2 Mid-Term Evaluation of 1994 to 1999 CSF
The organisation of the mid-term evaluation process of the 1994 to 1999 CSF has been described above. The mid-term evaluation of the CSF, carried out by the ESRI, was informed by evaluations at operational programme level. The evaluation report again combined macroeconomic and microeconomic methodologies and built on the approach used in the earlier ex-post evaluation of the CSF 1989 to 1993 as described above. In particular, the evaluation employed an innovative and sophisticated methodology for the microeconomic analysis of measures under the various operational programmes. The microeconomic analysis began from the principle that public policy should be directed towards correcting distortions. The evaluators contended that, rather than simply meeting agreed objectives, each spending programme should have to pass a more rigorous test, namely, whether it reduced distortions sufficiently to justify the additional cost involved (in the form of increased taxation). This philosophy required that each measure justify itself in terms of the opportunity costs of public funds.
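The opportunity-cost test described above can be sketched numerically. The following is a minimal illustration only: the figures and the "shadow premium" on public funds are assumed values for the sketch, not parameters taken from the evaluation.

```python
# Illustrative sketch of the "opportunity cost of public funds" test:
# a spending measure is justified only if the distortion-reducing
# benefit it delivers exceeds its budgetary cost *including* the
# deadweight cost of raising that money through taxation.

MCF_PREMIUM = 0.3  # assumed marginal-cost-of-public-funds premium (hypothetical)

def passes_opportunity_cost_test(benefit: float, budget_cost: float,
                                 mcf_premium: float = MCF_PREMIUM) -> bool:
    """Return True if the measure's benefit covers its full social cost."""
    full_social_cost = budget_cost * (1.0 + mcf_premium)
    return benefit > full_social_cost

# Two hypothetical measures with the same budget but different benefits.
print(passes_opportunity_cost_test(benefit=120.0, budget_cost=100.0))  # False: 120 < 130
print(passes_opportunity_cost_test(benefit=150.0, budget_cost=100.0))  # True: 150 > 130
```

The point of the sketch is that a measure meeting its stated objectives (benefit above budget cost) can still fail the more rigorous test once the cost of taxation is counted.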


5 Subsequently, the European Commission commissioned a separate appraisal of the National Development Plan, 1994 to 1999, as submitted by the Irish authorities.


Building on these principles, the microeconomic analysis proposed and employed a new functional classification of measures. Under this approach, spending measures across nine operational programmes were divided into four categories:6
- First, spending to provide services thought to have a "public good" characteristic that would inhibit their optimal provision in the private sector.
- Second, measures chiefly designed to alter relative prices facing private firms and individuals in order to correct for some externality; in other words, what is known as a corrective subsidy.
- Third, targeted schemes designed to alter behaviour where private agents are thought to be inadequately informed or where a specific externality or information barrier exists.
- Fourth, subsidies whose chief effect is redistributional in character.
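The four-way classification above, together with the anchor-measure screening described next, can be sketched as a simple tagging-and-screening scheme. In this minimal sketch the measure names, scores and threshold are all hypothetical, and a single score per measure stands in for the category-specific checklists actually used:

```python
from enum import Enum

class Category(Enum):
    PUBLIC_GOOD = 1         # services the private sector would under-provide
    CORRECTIVE_SUBSIDY = 2  # price changes correcting an externality
    TARGETED_SCHEME = 3     # fixes an information barrier or specific externality
    REDISTRIBUTION = 4      # chiefly redistributional subsidies

# Hypothetical measures: (name, category, score relative to the
# category's "anchor measure", where the anchor scores 1.0).
measures = [
    ("rural water scheme",  Category.PUBLIC_GOOD,        0.9),
    ("energy-audit grant",  Category.CORRECTIVE_SUBSIDY, 0.4),
    ("training voucher",    Category.TARGETED_SCHEME,    1.1),
]

def screen(measures, threshold=0.5):
    """Flag measures scoring well below their category anchor for closer review."""
    return [name for name, _, score in measures if score < threshold]

print(screen(measures))  # flags only the weakly performing measure
```

The design point is that comparisons are made within a category (against its anchor), which is what makes it possible to "compare like with like" across otherwise very different operational programmes.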



Each measure and sub-measure was assigned to one of the four categories. Within each category, then, it was easier to compare like with like, even across different operational programmes. The evaluators identified an "anchor measure" for each category – some scheme which was well understood and which could provide an initial reference point against which the performance of others could be placed. For each of the four categories, a checklist of criteria was developed for the purpose of an initial screening of the more than 100 measures across the nine operational programmes. This initial screening identified a number of measures performing poorly in terms of the criteria used, which were then subject to a more detailed examination. This resulted in the ultimate identification of some 18 measures where the evaluators questioned the need for the level of planned financial allocation and eight measures that needed additional funds.

4.2 NDP/CSF 2000 to 2006

4.2.1 An Agreed Evaluation Approach
A significant development in the current period is the formal adoption of an agreed evaluation approach, comprising a set of five key evaluation questions. This is formally set out in the CSF, which states that “All evaluations will be required to address the five key evaluation questions of Rationale, Relevance, Effectiveness, Efficiency and Impact.” This approach to evaluation was recommended in the CSF Evaluation Unit’s earlier review of the ongoing evaluation function in the 1994 to 1999 CSF.7 This report found that a major

6 More details on the methodology and the classification system used can be found in the paper Mid-Term Evaluation of the CSF for Ireland submitted by David Hegarty (CSF Evaluation Unit) and Patrick Honohan (ESRI) to the European Conference on Evaluation Practices in the Field of Structural Policies in Seville, March 1998, or in the published evaluation report: Honohan, P. (ed.), EU Structural Funds in Ireland: A Mid-Term Evaluation of the CSF, 1994–99, July 1999
7 CSF Evaluation Unit, Review of Ongoing Evaluation Function in the CSF for Ireland, 1994–1999, October 1998 (available at


problem with evaluation systems in the 1994 to 1999 CSF was the absence of a shared, common understanding of the purpose of evaluation, with significant differences evident in the focus or emphasis of evaluation work across different operational programmes. The approach encapsulated in the 5 key evaluation questions emerged from a review of evaluation literature as well as drawing on previous CSF-level evaluation work carried out in Ireland. The influence of earlier ESRI work – with its emphasis on the importance of justifying investment by reference to clear market failures - is apparent from the inclusion of the question of rationale in the evaluation approach. The other evaluation questions of relevance (i.e., what are the implications of external developments for the programme?), effectiveness (i.e., is the programme meeting its objectives?), efficiency (i.e., are programme benefits commensurate with costs?) and impact (i.e., what are the net socio-economic effects of the intervention?) will be familiar to evaluation practitioners and are implicit in the evaluation provisions of the Structural Fund regulations. In terms of implementing this approach over the programme cycle, it is recognised that the emphasis to be given to the different questions will vary depending on the stage at which a particular evaluation is being undertaken. 
This is recognised in the CSF, which states that “The emphasis in any individual evaluation will, however, depend on the specific circumstances, including the breadth of focus proposed, the point in the programme cycle at which evaluation occurs, timing constraints, and possible need to examine particular issues.” In operational terms, the evaluation approach is being implemented over the programme cycle as follows:
- At the ex-ante evaluation stage, the emphasis of the evaluations carried out (see below) was on the question of rationale;
- The focus of interim evaluation (including the mid-term evaluation as well as other pre mid-term interim evaluation work) is on issues around programme relevance, effectiveness and, to a lesser extent, efficiency and impact (in that order);
- Post mid-term, the intention is to concentrate evaluation work on the questions of programme efficiency and impact.
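The staging described above amounts to a mapping from programme-cycle phase to the questions emphasised at that phase. As a minimal sketch (the phase labels and ordered lists paraphrase the text rather than reproduce any official schema):

```python
# The five key evaluation questions of the agreed approach, and the
# emphasis given to them at successive programme-cycle stages
# (most-emphasised first in each list).
QUESTIONS = ["rationale", "relevance", "effectiveness", "efficiency", "impact"]

EMPHASIS_BY_STAGE = {
    "ex-ante":       ["rationale"],
    "interim":       ["relevance", "effectiveness", "efficiency", "impact"],
    "post-mid-term": ["efficiency", "impact"],
}

# Sanity check: every emphasised question is one of the five key questions.
for stage, emphasised in EMPHASIS_BY_STAGE.items():
    assert all(q in QUESTIONS for q in emphasised), stage

print(EMPHASIS_BY_STAGE["ex-ante"])  # ['rationale']
```

The value of fixing such a mapping in advance is that the terms of reference for any individual evaluation can be derived from its position in the cycle, rather than negotiated afresh each time.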


The adoption of this evaluation approach has provided a valuable “anchor” for the NDP/CSF evaluation process. In particular, the adoption and communication of this schema has helped create a greater awareness and understanding of the purpose and focus of evaluation work among programme managers and other stakeholders. The key evaluation questions underpin both the design of evaluation work programmes and terms of reference for individual evaluation projects. For example, the terms of reference for the mid-term evaluations at OP and NDP/CSF level explicitly formulate the evaluation objectives in terms of the questions of programme effectiveness and relevance.

4.2.2 Ex-Ante Evaluation
As outlined earlier, the ex-ante evaluation of the current NDP/CSF was organised in two stages, with an initial assessment of investment needs followed by a formal ex-ante


evaluation of the NDP. The approach taken in these two evaluation reports is discussed below. In line with the agreed evaluation approach, a strong emphasis was placed in both evaluations on the rationale or market failure justification for different types of investment. As outlined above in section 3.3.1, the Department of Finance commissioned an independent analysis of investment needs in 1998. The approach taken in the report, carried out by the ESRI, comprised the following broad analytical steps:8
- As a first step, the study considered the evidence from Ireland, and elsewhere in the EU, as to the importance and role of different types of public investment in promoting economic growth.
- Secondly, the report examined the likely growth of the economy over the planning period with a view to determining the key constraints the economy was likely to face.
- These constraints having been identified, the report established the broad priorities for investment under the new NDP as well as a range of other, non-investment supporting policy measures.
- Building on this analysis, and again employing the market failure functional classification of investment used in the mid-term evaluation (discussed above at section 4.1.2), the evaluators produced detailed, quantified recommendations on the investment priorities necessary to achieve the objectives of the plan.

As mentioned earlier, the formal ex-ante evaluation of the National Development Plan was undertaken by the CSF Evaluation Unit in Autumn 1999. The approach taken in the evaluation is explicitly framed in terms of the five key evaluation questions schema outlined above. In particular, the evaluation focused strongly on the issue of the rationale or market failure justification for the different investments proposed in the Plan. In this regard, the evaluation again drew on the methodology employed in the mid-term evaluation of the 1994 to 1999 CSF (discussed above at section 4.1.2). The evaluation also focused on the question of the continued relevance of particular investments proposed, considering the significant changes that had occurred in the socio-economic environment in recent years. The evaluation looked to the extensive volume of evaluation work carried out under the previous CSF for evidence as to the likely effectiveness, efficiency and impact of particular interventions. In line with the terms of reference, the report refrained from making recommendations regarding financial allocations or other matters, concentrating instead on raising issues of concern, posing questions and drawing general conclusions. A number of the issues raised in the evaluation have been revisited in the context of the interim evaluation process discussed above in section 3.3.2.

8 For more details, see the paper Ex Ante Evaluation Process for the 2000-2006 Period in Ireland presented by David Hegarty (CSF Evaluation Unit) and John Fitz Gerald (ESRI) to the EU Commission Conference on Evaluation of Structural Funds, Edinburgh, 2000, or the full report: Fitz Gerald, J., et al. (eds.), National Investment Priorities for the Period 2000-2006, ESRI Policy Research Series, No. 33, March 1999 (Executive summary available at


5. Conclusions/Utilisation
This final section of the paper considers issues around the utilisation of evaluation. Is evaluation regarded as important by programme managers and other key stakeholders? To what extent has evaluation influenced the design of structural fund programmes and the allocation of resources? Has evaluation made a difference?

There is some evidence that evaluation has influenced both programme design and the allocation of structural fund resources. The mid-term evaluation of the CSF 1994 to 1999, discussed earlier both in terms of the organisation of the process and the nature of the evaluation approach taken, is a case in point. As concluded in the paper prepared for the Seville conference (cited earlier), the findings of the mid-term evaluation made a significant contribution to the mid-term review and clearly influenced the decisions taken on the reallocation of structural funds by the Commission in conjunction with the national authorities.9 The CSF Monitoring Committee in July 1997 agreed to a package of mid-term financial adjustments amounting to some €163 million, representing some 7.5 per cent of planned 1998 and 1999 structural funds expenditure. This package included reductions to the budgets of eight of the 18 measures where the level of existing financial commitment had been called into question by the evaluator. Five of the eight areas identified in the evaluation as requiring additional aid were allocated additional resources in the mid-term review. As also discussed in the Seville Conference paper, the philosophy of the CSF evaluation influenced some of the non-financial decisions taken in the mid-term review. The comprehensive ex-ante evaluation of the current NDP 2000 to 2006 discussed above appears to have had some influence on the shape of both the NDP and the CSF.
As discussed in the paper prepared for the Edinburgh Conference, it would appear that the initial analysis of investment priorities influenced the wide-ranging public debate on the priorities of the Plan and was the subject of considerable discussion in policy-making circles and in debate around the design of the NDP.10 The paper concluded that "It is clear that this report was a key input to the Plan and that, broadly speaking, the thrust of the Plan was in line with the recommendations made." The later ex-ante evaluation of the NDP represented an important input to the negotiations between the Commission and the national authorities on the CSF. For example, the Commission shared the concerns raised in the evaluation about the share of resources allocated to the Productive Sector (an operational programme covering investment in research and development and the industry, tourism, agriculture and fisheries sectors) in the Plan, and the limited support to this sector under the CSF was focused on competitiveness rather than on employment creation.

As discussed above, the conduct of interim evaluation has been a noteworthy feature of evaluation in Ireland. The interim evaluation work carried out over the last couple of years – which has concentrated on issues around programme management and implementation – has met with a positive response and good co-operation from programme managers. It appears that the focus of the evaluations has proved relevant and useful from their perspective at this early, pre mid-term stage in the programme cycle. In addition, some of the reports have given rise to considerable media comment and public debate. For example, the recent interim evaluation of NDP/CSF investment in the road network drew attention to the significant escalation in the estimated cost of completion of the national roads programme.11 This was the subject of widespread comment in the media and elsewhere and has led to a renewed focus on the issue of cost management in the programme. As to the impact of this set of evaluations in terms of programme adjustment, it is probably too early to draw conclusions; one of the purposes of these evaluations was to provide an input to the mid-term evaluation process, which is not yet complete.

More broadly, it is clear that the requirements of the EU regulations have helped promote an evaluation culture and capacity in Ireland. As mentioned at the outset, there was very little evaluation carried out in Ireland prior to the advent of significant structural fund transfers in the early 1990s. As was the case with other aspects of the management and implementation of the structural funds, Ireland responded positively to the evaluation requirements of the EU. Through the course of the three CSFs, Ireland has developed evaluation structures and approaches which are now, through the current NDP, being applied to some one fifth of total annual government expenditure. The overall lesson from the Irish experience is that a well-organised and adequately resourced evaluation system, underpinned by appropriate structures and a clear sense of purpose or focus, is an important instrument in maximising the benefits of the structural funds.

9 David Hegarty and Patrick Honohan, ibid.
10 David Hegarty and John Fitz Gerald, ibid.
Evaluations carried out at the right time by experienced and detached evaluators, with a focus on appropriate questions and the commitment of key stakeholders, can make a difference.


11 Fitzpatrick Associates, Evaluation of Investment in the Road Network, August 2002 (available at


References

Hegarty, D. and Honohan, P., Mid-Term Evaluation of CSF for Ireland, Paper prepared for European Conference on Evaluation Practices in the Field of Structural Policies, Seville, March 1998

Hegarty, D. and Fitz Gerald, J., Ex Ante Evaluation Process for the 2000 to 2006 Period in Ireland, Paper prepared for Fourth European Conference on Evaluation of the Structural Funds, Edinburgh, September 2000

CSF Evaluation Unit, CSF Performance Indicators: Proposals for 2000-2006 Programming Period, October 1999

ESRI, The Community Support Framework – Evaluation and Recommendations for the 1994-1997 Framework, Final Report to the Department of Finance, April 1993

Government of Ireland, European Union Community Support Framework for Ireland, NDP/CSF Information Office, Department of Finance, December 2001

Honohan, P. (ed.), EU Structural Funds in Ireland: A Mid-Term Evaluation of the CSF, 1994-99, ESRI Policy Research Series Paper No. 31, July 1999

Fitz Gerald, J. et al (eds.), National Investment Priorities Report for the Period 2000-2006, ESRI Policy Research Series, No. 33, March 1999

CSF Evaluation Unit, Review of Ongoing Evaluation Function in the CSF for Ireland, 1994-1999, October 1998

Fitzpatrick Associates, Evaluation of Investment in the Road Network, August 2002

