Monitoring and Evaluating Study Support Activities

One of the purposes of the government's drive to develop extended services in and
around schools is to raise standards. As part of the core offer of extended services,
programmes of Study Support are the most readily achievable way of doing that.

Study Support is at the heart of school improvement. It is relevant to every child in
every school because it has three overlapping purposes, each of which has raising
students' achievement at its core:
     Removing obstacles and developing readiness for learning
     Increasing competence at learning
     Broadening and deepening success in learning.

Study Support can contribute to raising standards either directly (for example through
increased attainment as a result of activities such as paired reading schemes,
homework clubs or revision sessions) or indirectly (for example through increased
motivation or improved attendance in school).

School governors and the senior leadership team will want to review how their Study
Support programme is supporting the delivery of the national strategies and school
improvement plan targets, and ensuring that the five outcomes of Every Child Matters
are met.

Therefore it is important to monitor the activities provided, and to evaluate their
success. This can then be used to further develop the programme of activities.


What is monitoring?

Monitoring is an ongoing process of looking at what is happening with your study
support programme: for example, who is attending and what they are doing while they
are at the club. An example of monitoring is keeping registers of attendance; you can
show that an activity is successful if pupils keep coming back.
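
If registers are kept electronically, a short script can turn them into simple
monitoring figures. The sketch below is a minimal illustration only, assuming a
hypothetical export called register.csv with columns "pupil" and "date" (one row per
attendance); it counts the sessions each pupil attended and how many pupils came back
more than once.

```python
import csv
from collections import Counter

# Count sessions attended per pupil from a hypothetical register export.
# Assumed columns: "pupil" (name or ID) and "date" (one row per attendance).
attendance = Counter()
with open("register.csv", newline="") as f:
    for row in csv.DictReader(f):
        attendance[row["pupil"]] += 1

total_pupils = len(attendance)
returners = sum(1 for sessions in attendance.values() if sessions > 1)

print(f"Pupils who attended at least once: {total_pupils}")
print(f"Pupils who came back for more than one session: {returners}")
for pupil, sessions in attendance.most_common():
    print(f"{pupil}: {sessions} sessions")
```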


What is evaluation?

Evaluation is a structured way of thinking about what happens during your project, and
why. It can be simple or complex, depending on the resources you have available and
what you want to find out. A key part of any evaluation process is setting objectives
and intended outcomes for the project, and then gathering evidence to see whether
these objectives have been achieved. The steps below outline the evaluation process
(from Evidence and Evaluation on the Teachernet website):

     1. Planning: determining what you want to find out and how you will identify
        success
     2. Collecting evidence: by using appropriate research tools
     3. Analysis: assembling and interpreting data
     4. Reporting: on and sharing findings/results
     5. Reflecting: and moving forward, using findings to improve practice

Planning your evaluation

The evaluation planning template in the toolkit is a useful tool for setting out what you
want to achieve and how you will go about it.

Quality assuring study support activities

As part of the evaluation of your study support programme, you should carry out
regular quality assurance checks on the activities that you are running, particularly
those offered by external providers. It is also good practice, where possible, to do
the same for activities that you signpost to.

You may like to involve others in carrying out the observation visits, for example
governors or pupils. The QA template in the toolkit can be used when you undertake a
quality assurance visit; remember to discuss your observations with the providers
concerned as a means of improving provision. Your observation notes might include
comments on safeguarding pupils, how the tutor is evaluating their activities and
measuring impact, pupil comments etc.

The evaluation schedule in the toolkit is a useful way of providing an overview of the
visits you have undertaken. You may decide to visit all clubs on a rotational basis, or
to focus your attention on external providers or on those judged to be performing at a
satisfactory level or below.


Measuring the Impact of your study support programme

One important evaluation question to ask in relation to your study support activities is
“what impact is the activity having on the young people taking part?”. This could be
done for individual activities or for the programme as a whole: what difference does
study support in general actually make? This information can then be used for a wide
variety of purposes, as summarised below:

     Why measure impact?
     Justification: are you meeting your needs and objectives? Funding; reporting,
     e.g. for the SEF.
     Recruitment: of pupils, of staff and of organisations.
     Research: assessing outcomes, effects on specific groups, trends over time and
     value for money.
     Improvement: targeting; programme details, e.g. locations, times and staffing;
     informing planning and helping to shape future programmes.

Many schools and organisations have found that measuring the impact of study
support activities is not always an easy task, given the wide-ranging benefits that
reportedly result from such activities and the difficulty of isolating the impact of study
support. Questions of 'what to measure' and 'how to do it' are common, and there is no
one correct answer. It may be that you want to measure the direct impact of study
support on academic attainment, or that you want to focus on soft skills such as
emotional awareness or self-esteem. Some data may be easier to collect than others –
as shown below:

     Easier to measure: school attendance, test scores, skill acquisition.
     More difficult to measure: attitude to school, attitude to learning, self-esteem,
     confidence, motivation, enjoyment.

There are numerous methods for collecting the data you require; these are discussed in
more detail in The Evaluation Factor (Wigan Council, 2006). Whichever tools you use,
you will be collecting qualitative data, quantitative data or a mix of the two.


Types of Evidence: qualitative and quantitative data

Evidence based on quantitative data is numerical. Figures such as data on
participation (e.g. records of attendance), the percentage of participants who are
satisfied with the service, reading ages, pupil attainment and the number of reported
crimes in the area are all examples of quantitative evidence. Such evidence may be
obtained using
research tools such as administrative records, survey questionnaires or local area
statistics.

Evidence based on qualitative data is non-numerical and based upon interpretations,
observations, accounts and opinions. Examples include participant comments,
observations of the activity and staff involved, interviews and focus groups, and
open-ended questions on questionnaires.

It is usually preferable to gather a mix of both qualitative and quantitative evidence,
and to investigate changes over time.
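
As a simple illustration of the two types of evidence, the sketch below assumes a
hypothetical questionnaire export called questionnaire.csv with a closed question in a
"satisfied" column (Yes/No) and an open question in a "comments" column; it produces a
quantitative satisfaction percentage and gathers the qualitative comments for reading
and theming.

```python
import csv

# Hypothetical questionnaire export: a closed question ("satisfied": Yes/No)
# and an open question ("comments": free text).
satisfied = 0
total = 0
comments = []

with open("questionnaire.csv", newline="") as f:
    for row in csv.DictReader(f):
        total += 1
        if row["satisfied"].strip().lower() == "yes":
            satisfied += 1
        if row["comments"].strip():
            comments.append(row["comments"].strip())

# Quantitative evidence: a figure you can report and track over time.
if total:
    print(f"{100 * satisfied / total:.0f}% of participants satisfied ({satisfied} of {total})")

# Qualitative evidence: comments to read, group into themes and quote in reports.
for comment in comments:
    print("-", comment)
```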


Where to start?

The measuring impact planning sheet in the toolkit can help you plan a more in-depth
evaluation of the impact of a particular activity.

It follows a similar structure to the evaluation process outlined above: you should start
with a question you would like answered, such as “Does attendance at the science
club improve attainment in science?”. You will then need to consider the data you will
need to collect. Because impact relates to improvement, you will need data from both
before the activity (baseline data) and after it. For the above example you
would need to collect attendance data (to know who has attended the club), baseline
data such as predicted GCSE results and post activity data such as actual GCSE
grades.
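
As a very rough sketch of this kind of comparison, the example below assumes two
hypothetical files: science_club_register.csv (a "pupil" column listing attendees) and
science_results.csv (columns "pupil", "predicted_points" and "actual_points", with
grades already converted to a points score). It compares the average gain over
prediction for pupils who attended the club with those who did not; it is illustrative
only, not a prescribed method.

```python
import csv

# Hypothetical register of club attendees: one "pupil" column.
with open("science_club_register.csv", newline="") as f:
    attendees = {row["pupil"] for row in csv.DictReader(f)}

# Hypothetical results file: predicted and actual grades as points scores.
progress = {"attended": [], "did not attend": []}
with open("science_results.csv", newline="") as f:
    for row in csv.DictReader(f):
        gain = float(row["actual_points"]) - float(row["predicted_points"])
        group = "attended" if row["pupil"] in attendees else "did not attend"
        progress[group].append(gain)

for group, gains in progress.items():
    if gains:
        avg = sum(gains) / len(gains)
        print(f"{group}: {len(gains)} pupils, average gain over prediction = {avg:+.2f} points")
```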


             Possible Questions to ask:
             To what extent did activity X achieve its objectives?
             Does attendance at activity X improve academic attainment?
             Does attendance at activity X develop personal skills?
             What type of pupil benefits most from activity X?

Data collection resources

As mentioned above, Wigan Council has produced an informative guide to evaluating
Study Support activities, which explains many of the potential data sources for your
evaluation. It also contains resources that can be photocopied, adapted to suit your
own needs, or used as a stimulus for developing your own activities. It can be
downloaded from
http://www.wigan.gov.uk/Services/EducationLearning/Schools/ExtendedHours/OutofSc
hoolHours.htm.

The National Evaluation of the Children‟s Fund (NECF) has also produced The
Evaluator’s Cookbook – a resource book of participatory evaluation exercises for use
with children and young people. It can be downloaded from http://www.connexions-
leics.org/ns/pdc/docs_pdfs/step6/Evaluator's%20Cookbook.pdf.

The toolkit contains several resources that may be of use in evaluating the impact of
study support, including school-wide surveys, and an impact evaluation for younger
pupils. These can of course be adapted to suit your own needs. You might also like to
think about getting pupils involved in designing or carrying out the research.


Alternative methods of data collection

Don‟t be afraid to try out different methods of data collection to find out what works well
for your school and community.

Some schools have found that using an online survey tool has really helped increase
response rates. It is very simple to create a basic online survey using free tools such
as www.surveymonkey.com or www.surveymethods.com. These tools can also be used to
collate results, although you will need to subscribe to access the more advanced
features (discounts may be available for educational institutions).
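
If you would rather analyse the raw responses yourself, most of these tools let you
export responses as a CSV file. The snippet below is a minimal sketch under
assumptions: a hypothetical export called survey_responses.csv with an "enjoyed_club"
column, and an assumed figure for the number of pupils invited; it reports the
response rate and tallies the answers.

```python
import csv
from collections import Counter

# Assumed number of pupils who were sent the survey link (hypothetical).
invited = 120

# Tally answers to a closed question from a hypothetical CSV export.
answers = Counter()
with open("survey_responses.csv", newline="") as f:
    for row in csv.DictReader(f):
        answers[row["enjoyed_club"].strip().title()] += 1

responses = sum(answers.values())
print(f"Response rate: {responses} of {invited} ({100 * responses / invited:.0f}%)")
for answer, count in answers.most_common():
    print(f"{answer}: {count}")
```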

Kent Children's University recommends the use of interactive voting tools as a really
effective way to engage young people in evaluating activities. There is a range of
hand-held voting pads available, from simple ones with 'yes' and 'no' buttons to more
complex tools with text-entry functions. The software makes it easy to create
question-and-answer sessions, and children and young people enjoy using them. An
additional benefit is that responses are instant and can be traced back to individuals.
Qwizdom is one provider of such tools (www.qwizdom.co.uk). Many schools may find that
they already have a set being used in classrooms, or a cluster may like to share a set
for use
in evaluating their extended services, including study support.


Other things to consider

As well as thinking about what data you are going to collect you should consider your
sample size and whether a control group is possible.

Sample size is only a concern when you want to draw general conclusions from your
findings. If you are measuring the impact of a single activity it is likely that you can
assess the whole group and won't need to consider issues of sample size and
composition. If there are too many students to record the results of the whole group
you will need to use a sample group, and it is important to ensure the sample is of
sufficient size for results to be representative. As a general rule, a sample size of 30 is
considered sufficient for statistical purposes. This issue most often arises when a
school wishes to look at the overall impact of study support, i.e. comparing the
progress of pupils who did not attend any form of study support against those who did.

Control groups are helpful in determining whether observed outcomes are the result of
the study support activity or whether other factors had an influence. The control group
should ideally be similar in every way to those who attended the activity, the only
difference being that the control group did not attend. An ideal control group is often
difficult to set up, but conclusions will usually be more reliable if even an imperfect
control group is used.
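
Where you do have numerical progress data for both groups, a simple statistical
comparison can help judge whether any difference is likely to be more than chance. The
sketch below is illustrative only: the progress scores are made-up placeholders, and it
uses an independent-samples t-test from the SciPy library rather than any method
prescribed in this guidance.

```python
# Compare an activity group with a control group using an independent-samples t-test.
from scipy import stats

# Change in assessment score over the year for each pupil (hypothetical placeholders).
attended_club = [4.0, 3.5, 5.0, 2.5, 4.5, 3.0, 4.0, 5.5]
control_group = [3.0, 2.5, 3.5, 2.0, 4.0, 2.5, 3.0, 3.5]

result = stats.ttest_ind(attended_club, control_group)
print(f"Mean gain (club): {sum(attended_club) / len(attended_club):.2f}")
print(f"Mean gain (control): {sum(control_group) / len(control_group):.2f}")
print(f"t = {result.statistic:.2f}, p = {result.pvalue:.3f}")
# A small p-value suggests the difference is unlikely to be chance alone, but with an
# imperfect control group treat this as indicative rather than proof.
```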


Hints and Tips

        Make sure what you ask is relevant and covers the main issues. If
        information is interesting but you are not really going to use it, don't
        waste people‟s time by collecting it

        Make clear the aims and objectives of the project, for example to
        increase the number of girls attending after school ICT activities, and
        work out your key indicators of success, e.g. regular attendance of
        60% of the target group, more parents from vulnerable families
        contacting the school informally etc

        Try to establish baseline quantitative evidence before the
        project/service begins, for example the number of late arrivals or the
        number of unauthorised absences. Then you can collect the same
        evidence at the end of the project and use it to assess any changes.


        Try to encourage the involvement of key stakeholders in gathering
        evidence. This will promote engagement with, and ownership of, the
        project. For example, students may be involved with the questionnaire
        design and in the collating of responses; gather views from a range of
        people such as members, teaching staff, partners and parents


        Be creative in your approach to collecting evidence – for example,
        questionnaires can be created online, feedback can be collected using
        photographic or video evidence (but be aware of safeguarding issues
        around collecting evidence – you will need written permission from the
        children and their parents/carers to take photographic or video
        evidence, and data protection issues need to be considered if you are
        collecting personal information about individuals)


        Work out what you are going to do with the information – know how
        you will collect, analyse and feed back the results

        Make one person responsible for the process of collecting data and
        designate people to analyse and feed back the results


        Use your feedback to feed forward! In other words, use your data not
        only to establish the extent to which projects have met their aims and
        objectives, but also to inform future planning.




And most importantly, don’t be afraid to start!
Evaluating your study support activities does not have to be overwhelming. Start
small, perhaps by looking at attendance and enjoyment for one particular activity,
then build up to more detailed evaluations, such as measuring the improvement linked
to one particular club, which you can use to inform targeting and planning. You may
then feel able to move on to a larger-scale measurement of the impact of study support
across the school, and to action research into study support and its effects.




Quality in Study Support (QiSS)

The Quality in Study Support (QiSS) programme provides a range of services to
support the development of study support and other extended services.

QiSS produces the Study Support Code of Practice; the new version is titled
"Extended Learning Opportunities – a framework for self-evaluation in Study Support".
This document sets out principles of good practice in study support and provides key
indicators and a process for self review. The QiSS process can be used as a tool to
ensure the quality of Study Support in a school, leading to the award of a recognised
quality mark.

One of the main aims of QiSS is to provide support and professional development
training for schools and other organisations in measuring and evaluating the impact of
study support provision. QiSS has produced a set of leaflets to assist in measuring
impact, which are available on the QiSS website.

For more information about QiSS please contact the Cambridgeshire County Council
extended learning team or visit the QiSS website: www.canterbury.ac.uk/qiss/.
