London Development Agency’s 2012 Games Legacy Impact Evaluation Study

Notes for Bidder Question and Answer Session held on 2 February 2009.


Proposed Agenda

      Welcome & Introductions
      LDA Evaluation Context
      Overview of LDA Olympic Legacy programme & impact study
      Questions & Answers
      Next Steps

Proposed Attendees:
Dr Michelle Reeves, Senior Policy Manager, Olympic Legacy Directorate, LDA
Rachel Massey, Head of Programme Management Office, Olympic Legacy
Directorate, LDA
Simon Griffiths, Gateway Manager, Corporate Programme Management Office, LDA
Mick Stokes, Head of Strategic Procurement, LDA
Caroline Brooks, Project Manager, Olympic Legacy Directorate, LDA

Due to the adverse weather conditions, a number of the LDA officers and potential
bidders were unable to attend the session. As a result, a shortened presentation and
Q & A session was given by Simon Griffiths and Michelle Reeves to those who were
able to attend. The session will be re-run on 18 February 2009 at 13.00 at Palestra,
197 Blackfriars Road, London SE1 8AA.

These notes provide an overview of discussion at the session.

Simon Griffiths, the LDA’s Gateway Manager, provided an overview of the LDA’s
broader evaluation context and work to prepare the 2008 London RDA Impact
Report, setting out how it is influencing the LDA’s new programme/project governance
arrangements and the expectations of this 2012 Games legacy impact evaluation
study.

Key headlines:

      The LDA is using the BERR evaluation guidelines as the methodology for all
       LDA project and programme evaluations. They are a set of guidelines that the
       Agency aspires to use, but in the course of preparing the 2008 London RDA
       Impact Report, the Agency has learnt lessons that will help to improve the
       BERR guidelines as well as the way the LDA will conduct evaluations in the
       future;
      The RDA Impact Report was due to be published on 3 February 2009, but has
       been delayed pending a decision by BERR and HMT about the communication
       messages they want to use. The LDA included 78% of all relevant evaluations
       (project, programme and strategic evaluations) in its report. This covered a
       wide range of interventions except marketing and promotion initiatives, which
       will be evaluated in the future;
      The LDA has trialled the use of cost-benefit analysis for all evaluations
       conducted last year, with help from SQW Ltd. The trial period has now passed
       and it is now mandatory for all projects to obtain cost-benefit ratios. Going
       forward, the LDA wants to focus on quantifying social and environmental
       benefits, as well as economic benefits;




      This ties in with the LDA’s new programme/project governance structure
       based on the OGC Gateway model, comprising 5 gateways: (A) concept
       appraisal, (B) strategic/business case (Green Book compliant), (C)
       contracting (ensuring what is contracted matches what has been approved),
       (D) performance monitoring, and (E) evaluation and project closure;
      A key driver for this new governance structure is the new LDA board, which
       has established two new committees: an investment committee and an audit,
       risk and performance committee, both of which have a strong appetite for
       looking at evaluation and want to see ROI on all investments. They are
       interested in opportunity costs and why interventions work or don’t work;
      Our expectations for this study are: a quantitative impact evaluation to
       demonstrate to stakeholders how we have spent our 2012 investments,
       provide accountability to BERR, HMT, the Mayor and the London Assembly,
       and give evidence of how we have secured outcomes/targets; evidence of
       understanding of the relevance of market failure and how this translates into
       an intervention – therefore we want logic chains to show how the problem
       has been conceptualised, and the use of case studies to explain the cause
       and effect chain regarding what has worked and not worked. Additionally,
       the Agency wants to identify how we have influenced the spending patterns
       of others, and the legacy potential (sustainability) of interventions in terms
       of an exit strategy; and value for money – the LDA needs to understand how
       expensive our interventions are to deliver, and we will be looking to the
       bidder to deliver this;
      LDA evaluation studies, including this one, will ensure that programme/project
       designs continually deliver value for money;
      In terms of the approach to LDA project evaluations, all projects with
       investment under £1m will be self-evaluated using the LDA’s self-evaluation
       toolkit. All projects with investment over £1m will have to be externally
       evaluated.

Questions were solicited on the above:

Q1: It sounds like a comprehensive approach to evaluating everything that is
involved in the Olympics. Is it too ambitious?

A1: At the programme evaluation level, this study is not about the influence of the
Olympics, but an assessment of LDA investment in Olympic programmes. We are
looking for evidence of where the Agency has had the most influence that has
brought about a change in behaviour as a result of our interventions e.g. changes in
how other stakeholders are spending their budgets. Partnership working needs to
have tangible outcomes with respect to spending patterns. The starting point for this
analysis is examining what we are doing together with what others are doing and
who is influencing whom.

Q2: Identifying who is influencing whom will be quite difficult, requiring a forensic
examination.

A2: It is important to evidence the explicit actions of the LDA that are influencing
others, based on what we have set out to do in terms of project/programme objectives
and how we have set out to influence others. For this study it will be important to keep
it high-level and simple, and to identify the most important relationships.

Q3: What is the relationship of the study to the DCMS Olympic and Paralympic
Evaluation Framework?



A3: The DCMS Evaluation Framework is a framework of principles that the DCMS
hopes evaluators will adopt. It also suggests a number of impact domains, but the
Government Olympic Executive (GOE) has not yet confirmed a work programme to
implement the Framework. The LDA is in discussions with GOE to identify what
national 2012 evaluation studies may be planned and to ensure that these studies
and any economic models that they specify are compatible with the LDA evaluation
methodology and study.

It is important that there is consistency in these studies to ensure that we are
maximising the benefits of our study. The same goes for the approach to cost-benefit
analysis. Initially only the LDA was undertaking CBAs for all our
evaluations, but other RDAs are now starting to do this. The Government needs to
identify what approaches it wants to see applied with respect to CBAs.
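
As an illustration only (not part of the session discussion), the kind of cost-benefit
ratio referred to above can be sketched as follows. All figures are hypothetical
placeholders rather than LDA data, and the 3.5% discount rate follows the usual HM
Treasury Green Book convention but should be confirmed against the guidance.

    # Illustrative sketch of a cost-benefit ratio calculation. All figures
    # are hypothetical; the 3.5% discount rate is the standard Green Book
    # assumption and should be confirmed before use.

    def present_value(flows, rate=0.035):
        """Discount a list of annual flows (year 0 first) to present value."""
        return sum(f / (1 + rate) ** t for t, f in enumerate(flows))

    costs = [2.0, 1.5, 0.5]          # programme costs, £m per year (hypothetical)
    benefits = [0.0, 1.0, 2.5, 2.5]  # monetised benefits, £m per year (hypothetical)

    bcr = present_value(benefits) / present_value(costs)
    print(f"Benefit-cost ratio: {bcr:.2f}")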

Comment: There are no agreed national frameworks for monetising social benefits.

A4: For the LDA’s Gateway B, we ask for CBAs for all projects and the LDA is just
finalising how it will evaluate social benefits. We are looking at a number of models
to identify a way of valuing more early/social interventions and there should be
compatibility between this work and any CBA work undertaken as part of this study.

Q5: Does the LDA intend to publish this work?

A5a: Yes, it will be made available to consultants to augment existing LDA evaluation
guidelines.

Comment: It would be beneficial for the voluntary sector to assist with the evaluation
of their work.

A5b: The other action the LDA is undertaking is building a database from our
evaluation findings (identifying additionality and benchmarking unit costs, for
example) to help us develop our own benchmarks. Alongside this, Prof. Peter Tyler is taking all
the impact studies from all the RDA regions and will be issuing a set of guidelines to
enable assessment of leakage/displacement. This report will be published later this
year.
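
Purely as an illustration (not part of the answer given), benchmarks of this kind feed
into the standard gross-to-net adjustment used in RDA impact evaluations; the sketch
below uses entirely hypothetical rates.

    # Illustrative gross-to-net adjustment of the kind that additionality,
    # leakage and displacement benchmarks support. All rates are hypothetical.

    def net_additional_impact(gross, deadweight, leakage, displacement,
                              multiplier=1.0):
        """Convert a gross outcome (e.g. jobs) to a net additional figure."""
        net = gross * (1 - deadweight)   # remove what would have happened anyway
        net *= (1 - leakage)             # remove benefits leaking outside the area
        net *= (1 - displacement)        # remove activity displaced from elsewhere
        return net * multiplier          # apply any supply-chain/income multiplier

    print(round(net_additional_impact(gross=1000, deadweight=0.30, leakage=0.10,
                                      displacement=0.20, multiplier=1.1), 1))
    # 1,000 gross jobs -> 554.4 net additional jobs under these hypothetical rates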

A5c: Whilst we are working towards improving our capability to assess the impacts of
our investments, what is unknown is how the DCMS is going to look at the impact of
its investments, or whether it wants to do this.

Michelle Reeves ran through a PowerPoint presentation setting out the LDA’s role in
the 2012 Games and where programmes are up to, reminding consultants about the
objectives of the study, the study’s outputs and methodological requirements. She
focused on two areas of methodology: what the LDA could provide in terms of
information to assist the feasibility study, and an outline of some work that we have
commissioned from Grant Thornton around performance management and
beneficiary tracking, to highlight some of the considerations that consultants will need
to bear in mind when developing their methodology.

Key points (additional to slides)


Key outputs for the study: we recognise there are a number of unknowns in terms
of the data available and the quality of that data, which is why we have built in a
feasibility period to enable you to examine the information to hand. We expect your
bids to include an outline methodology, but we want to work with you and independent
experts to feed in best practice in impact methodologies to get the best and most
workable methodology for our budget.
Overall study methodology – we expect an economic model that can look at the
impacts of Olympic investment in London and take account of inputs and outputs. We
will be working closely with GOE on any national studies and it will be important that
both Government and the Agency use a consistent economic model.
Performance management data and beneficiary tracking - likely to be one of the
most challenging areas of the study. It is important to make you aware of some of the
work we have been doing and some of the issues this has thrown up, which you will
have to consider when developing your methodology. LDA-led/supported legacy
programmes contain a mixture of wholly LDA-funded and managed projects; LDA-funded
but externally managed projects; jointly LDA-funded projects with other
stakeholders; and projects not funded or managed by the LDA.
For LDA projects, delivery partners are required to collect registration data,
output/activity data and outcome data, and grant agreements/contracts with delivery
partners carry clauses requesting additional data. The LDA is also creating its own
beneficiary database and we are working with key partners to share data; there is
some data convergence but no formal protocols are in place. Consultants are unlikely
to be able to track beneficiaries solely through performance management data and
will need a survey or modelling approach. Even if we had perfect coverage through
our performance management systems, we would not capture all benefits, therefore
consultants need to factor this into their study methodology.
The study budget has been reduced to £520,000 because we have not secured
some other public sector funding. As a result, we have had to reduce some
programme budgets, and we have also re-allocated some budget to the Employment
and Skills Benefits (LEST) Programme to take account of the LEST beneficiary
survey. See PowerPoint presentation for new indicative programme budgets. There
is only limited flexibility to move budgets between programmes.

Questions were solicited about any points made in the presentation.

A6: Just a key point to make in relation to this study: the LDA obtained an exemption
from including its land delivery work for the Games and the Compulsory Purchase
Order in the 2008 RDA Impact Report. Within this impact study, it will be important to
assess the advantages of the LDA having done this work, and its catalytic or
accelerating effects in terms of the physical redevelopment of the Olympic site.

Q7: I am concerned about the extent to which we will need to capture partners’
investments (reference to the study objectives). You will need to invest a lot of
resources, as this will be very labour intensive.

A7a: The focus of this study is on the LDA’s role and this requires a depth of
analysis, but we do need to also look at the overall effect. We will be looking for
consultants to differentiate between which partners to include, and how this will be
dealt with. The study will be looking at our influence on others, for example Jobcentre
Plus: we are not really looking at their influence on us, but ours on them. This will
involve looking at how this relationship works, but the extent to which we do this for
other partners will depend on what type of project it is.

A7b: We are looking for consultants to prioritise examination of partners’
investments/activities based on the level of LDA investment and who is investing.




A7c: If we take our sports programmes and look at, say, the £20m we are investing,
we can look at this in terms of where the strategic influence will be. In the context of
wider investment in sports programmes, is our investment catalytic?

Q/comment 6: From a London perspective, people don’t mind who delivers, whether
it’s Jobcentre Plus, the LDA, etc. It is important to look at what the benefits are, but it
is also important to ask the beneficiaries through primary research.

A8: The LDA has moved a long way in terms of its approach to evaluation. The
project evaluations are important and we need to get these right. All the LDA’s big
projects have evaluation processes and investments in place. A lot of the projects
would have happened anyway. The role of the consultant is to act as a sounding
board to extract the ‘Olympic effect’.

Q9: You have said that surveys will be an important part of this study; are you
expecting beneficiary surveys for all programmes?

A9: Yes, but not necessarily at the scale of the LEST beneficiary survey, except for
the Olympic Park and Lower Lea legacy programme. We would expect some form of
primary research that captures beneficiary experiences and outcomes.

Q/comment 9: In your specification for the LEST beneficiary survey you initially
specify a +/- confidence level for wave 4, but later you say you want a +/-1.5
confidence level. Which do you want, as it makes a big difference?

A9: We will clarify the confidence level we require and let you know.

Q/comment 10: Experian in their LEST evaluation plan suggest a minimum sample
size at wave 4. This is not likely to be enough. Is that correct?

A10: Yes, it is not likely to be a large enough sample.
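
As a purely illustrative aside (not part of the answers given), the link between a target
margin of error, such as the +/-1.5 points mentioned above, and the sample size
needed at a given wave can be sketched as follows; the figures are placeholders and
not the Experian or LDA specification.

    # Illustrative relationship between margin of error and required sample
    # size for a proportion estimated from one survey wave. Placeholder figures.
    import math

    def required_sample(margin, z=1.96, p=0.5):
        """Sample size for a +/- margin (as a proportion) at ~95% confidence."""
        return math.ceil((z ** 2) * p * (1 - p) / margin ** 2)

    print(required_sample(margin=0.015))  # +/-1.5 percentage points -> ~4,269
    print(required_sample(margin=0.03))   # +/-3.0 percentage points -> ~1,068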

Q11: Are you looking for a longitudinal or cross-sectional survey for the LEST
beneficiary survey?

A11: We are looking for a longitudinal survey in that we want to track the progress of
a sample of individuals from their entry into legacy projects over time. However, we
are also specifying topping up to deal with sample attrition over time and also to
capture new potential beneficiaries who may come to projects at later stages.
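
For illustration only (not part of the answer given), the topping-up approach described
above can be sketched as follows; the per-wave attrition rate and wave-size target are
hypothetical placeholders.

    # Illustrative top-up plan for a longitudinal survey with attrition.
    # The 20% per-wave attrition rate and 1,000-respondent target are hypothetical.

    def top_up_plan(initial, waves, attrition, target):
        """New recruits needed at each wave to keep the panel at the target size."""
        retained, plan = initial, []
        for wave in range(2, waves + 1):
            retained = round(retained * (1 - attrition))  # lost to attrition
            top_up = max(0, target - retained)            # new recruits this wave
            plan.append((wave, retained, top_up))
            retained += top_up                            # topped-up panel carries on
        return plan

    for wave, retained, top_up in top_up_plan(1000, 4, 0.2, 1000):
        print(f"Wave {wave}: {retained} retained, top up by {top_up}")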




