
Skills for Life Improvement Programme




An Evaluation of DfES Skills for Life
‘revised trial version’ Skills Checks and
initial assessment tools and diagnostic
assessment materials.



May 2007
Contents page

1. Introduction

2. Skills checks

3. Initial assessments

4. Diagnostic assessments

5. Key findings

6. Recommendations for:
       o Future development of the tools
       o Support for effective use of the tools in practice
       o Further evaluation

7. Two case studies of usage

8. Appendices

       Appendix A: list of tools evaluated

       Appendix B: evaluation sites

       Appendix C: models of use

       Appendix D: issues relating to particular questions




Our sincere thanks to the 39 sites that participated in this evaluation project,
and to the 5 fieldworkers who visited them.
An Evaluation of DfES Skills for Life ‘revised trial version’ Skills Checks
and initial assessment tools and diagnostic assessment materials


1. Introduction
In November 2006, QIA commissioned the Skills for Life Improvement
Programme to carry out an evaluation of the literacy and numeracy screening,
initial and diagnostic assessment tools developed on behalf of the Skills for
Life unit of the DfES. The tools evaluated included literacy and numeracy
Skills Check and initial assessment tools, revised in 2006 to take account of
the outcomes of trials undertaken in 2005-2006, as well as the literacy,
numeracy, ESOL and dyslexia diagnostic assessment materials in use since
2003.¹

Purpose
The purpose of the evaluation was to:
•  evaluate the tools in a range of learning and skills settings
•  evaluate how effectively the tools are used and their purpose understood within a learner-centred assessment process
•  describe key features of the medium, structure, format and styles of the tools that are effective in engaging learners and in achieving the tools’ purpose
•  provide recommendations for:
      o future developments of the tools
      o support for their effective use in practice.

Methodology
The project recruited 39 centres to evaluate the tools between January and
March 2007. Although the sample size was relatively small, sites were
selected to ensure, as far as possible, that the tools would be evaluated
across all settings – Further Education colleges, training providers, offender
learning, adult learning and the Army.²

Potential evaluation centres were identified by contacting:
•  Sector Skills Councils – Asset Skills, GoSkills, Skillsmart and Skills for Health
•  providers participating in Train to Gain Skills for Life development projects
•  provider networks suggested by members of the Skills for Life Improvement Programme consortium
•  centres who had participated in the initial trial of the tools in 2005-2006.

Information was sent out to potential sites to raise awareness of the revised
trial versions of the tools by signposting them to the tools library website³ and
to invite them to express an interest in participating in the evaluation.


¹ A full list of the tools evaluated can be found in Appendix A.
² A full list of evaluation sites can be found in Appendix B.
³ www.toolslibrary.co.uk


An initial questionnaire sent out to interested sites sought to ensure that the
tools would be evaluated:
•  within a range of delivery models such as discrete, roll-on/roll-off and workplace learning
•  within a range of programmes such as E2E, Train to Gain, Job Centre Plus
•  in paper-based and in computer-based format
•  with a range of learner groups working across all levels of the core curriculum.

Centres began to pilot the tools from January 2007. Some centres
experienced delays in obtaining the tools due to:
•  a problem with the ordering system at Prolog, which resulted in orders received during the Christmas break not being dispatched in January
•  delays with the availability of the Skills for Health tools
•  centres that downloaded the initial assessment tools from the tools library website not being aware that they also had to order a CD for the speaking and listening element of the assessment.

The timing of the evaluation project had an impact on the centres and
programmes able to participate in the evaluation. Only centres with
programmes with continuous enrolment, such as E2E, Train to Gain, Job
Centre Plus or roll on / roll off discrete Skills for Life provision were able to
usefully evaluate the tools in the time specified. Although many centres
initially indicated their intention to use the sector specific tools, very few did.
This was due to:
•  problems obtaining the Skills for Health tools
•  anticipated Train to Gain delivery failing to materialise in time
•  a lack of new learners interested in or currently working in these sectors during the evaluation timeframe
•  in some cases, providers deciding the tools were not appropriate for the employees they were working with, usually because they were seen as too ‘intimidating’ (for example by an FE college working extensively with cleaners).

Centres were sent an evaluation sheet to record their ongoing reactions and
experiences of using the tools.

A team of 8 field workers visited the 39 centres in March 2007. The visits
aimed to capture quantitative and qualitative data through:
•  semi-structured interviews with:
      o managers responsible for literacy and numeracy initial assessment in the organisation
      o practitioners who had evaluated the tools
      o learners who had taken the assessments

Where possible, field workers observed tools in use.⁴

⁴ See Appendix B for the number of interviews and observations undertaken.


Questions focused on the following areas:
•  How the assessment tool is used
•  How the results are used (assessment outcomes and next steps)
•  Learner responses to the process
•  The effectiveness of the tools themselves
•  Evaluation of their own organisation’s use of the tools
•  Recommendations users would make to others
•  Quality assurance, resources and training issues

A draft report of the evaluation was written by the team of field workers at a
residential at the end of March.

Structure of the report
A detailed summary of responses to the interview questions is given in three
sections, each focusing on one of the three types of tool:

   Section 2   Skills checks

   Section 3   Initial assessments

   Section 4   Diagnostic assessments

   Each of these sections includes specific responses relating to particular sectors:
          o Further Education colleges
          o Training providers
          o Offender learning
          o Adult learning
          o The Army
   A summary of conclusions drawn from the responses is given at the end of
   each section.

   Section 5   Key findings

   Section 6   Recommendations for:
          o Future development of the tools
          o Support for effective use of the tools in practice
          o Further evaluation

   Section 7   Appendices



Direct quotations from those interviewed are shown in italics in the text.




2. Skills Check

2.1 What are the tools?
The Skills Check tools are designed to assess whether an individual
potentially has a literacy or numeracy need and to identify learners who would
benefit from a more in-depth initial assessment.

There are six versions of this tool available:
•  Standard – for general purpose use in a range of contexts
•  Workplace – for standard workplace usage (i.e. non sector-specific workplace)
•  Contextualised – for those currently working in or with recent experience of working in specific sectors:
      o Asset Skills – cleaning sector
      o Road Passenger Transport Industries
      o Skills for Health – designed for Health Care Assistants working in a variety of health care settings
      o Skillsmart Retail

The Skills Check tools are available in both paper-based and computer-based
format.




[A table describing the content of the Skills Check appeared here in the original; only the fragment ‘Number/Numeracy at Level 2’ survives.]

Assessors are advised to allow 15 minutes for the Skills Check to include a 5
minute preliminary interview.

Only the standard and workplace versions of the Skills Check were used by
providers in this study. A total of 226 skills checks were delivered: 196 paper-
based (71 standard and 125 workplace); and 30 computer-based (29
standard and 1 workplace).

The paper-based Skills Checks are marked by the assessor; the computer-
based tool provides an answer sheet with a tick or cross against the questions
attempted and a recommendation for next steps, for example, moving on to
Part B of the Skills Check or initial assessment. The answer sheet then has to
be printed. Results are not stored on the system.

10 providers in this study used the Skills Check: 6 private training providers,
1 FE college (for its Train to Gain learners), 1 ACL provider and 2 offender
learning providers.




Overall summary of responses to the interview questions:

2.2. How the Skills Check is used
Of the 10 providers, 7 used it as part of the initial assessment process, 1 used
it for selection and 2 used it to identify possible Skills for Life needs. For most
providers it is used as part of the induction/interview process and is used for
all prospective learners. For 7 providers it is mandatory and for 5 it is seen as
a contractual requirement. One training provider reported that only 1
prospective learner had ever refused to take part in the skills check and that
learner had not been offered a place on the programme.

Providers used the tool in a variety of venues, usually in a location with ease
of access for learners. Over half the providers felt that the venue used for
Skills Check was good, being informal, well-equipped and easy to access for
learners. Most assessments were carried out as a 1:1 process and were
paper-based. 4 providers quoted lack of IT resources at venues used as a
reason for using the paper-based version.

Staff administering the skills check explained to learners that it was not a test
with a pass or fail, but would be used to help find out what level of support
they needed, to place them in the correct class or to see what they needed to
learn.

Up to 30 minutes was allowed for completion of the assessment and most
learners completed in that time.

5 out of the 10 providers did not require those administering the Skills Check
to have had training in its use; 5 ran in-house training.

2.3 How the results are used
In most providers the tool was used as an integral part of the
interview/induction process. 8 providers reported that they gave the results of
the assessment to learners immediately during the interview. Most feedback
was done 1:1. For 7 providers, feedback on the outcomes of the assessment
forms part of the interview process and learners are directly involved in
deciding what the next steps are. In 7 providers the results are kept on the
learner’s file and learners proceed onto an initial assessment. 7 providers felt
that the most effective approaches to giving feedback included giving
learners individual attention and concentrating on the positives. 2
providers did not give direct feedback on the results. 1 provider using Skills
Check for its Train to Gain learners reported that there has been no follow up
from the Skills Check results. It is a contractual requirement to deliver Skills
Check or IA to learners, but no use is made of the results. As one FE
provider said:

       The results go to the SfL manager. Only one vocational team has
       asked to see the results. To date there has been no follow up from the
       Skills Checks.




2.4 Learner responses to the process
5 providers felt that learners’ overall response to the process was positive. 2
training providers reported that learners found it too easy. 1 training provider
reported that learners liked the process because it didn’t take too long and the
questions were new to them. A number of learners at 1 training provider said
that they would like the paper-based version in colour and for the illustrations
to be larger. 1 offender learning provider said that the learners struggled with
the language.

2.5 The effectiveness of the Skills Check tools
4 providers felt the tools worked well or very well with their target group. 4 felt
the tools worked satisfactorily and 2 felt they worked poorly.
Reported strengths included:
•  ease of administration
•  not taking too long
•  appropriate to the age range, with questions set in relevant situations
•  immediate results
•  attractive format.

Reported weaknesses included:
•  not enough variety of questions
•  too much emphasis on spelling
•  terminology in numeracy is too technical
•  difficult for the less able learners to navigate and understand.

7 providers liked the format and style, but 2 disliked the A5 size, finding it
“small and fiddly”.

6 providers found the skills check useful or very useful for the purposes for
which they use it; 3 found it not very useful.

6 providers found the instructions for use clear and straightforward. 1 provider
commented that it would be helpful to have a summary answer grid because
there was too much information provided in the assessor booklet.

8 providers had used different tools previously: BSA assessment (4), Target
Skills (3) and BKSB (1). 3 found the new tools more effective than previously
used tools and 3 found them less effective.

2.6 Evaluation of the organisation’s use of the tools
7 providers felt that they used the tools well; they explained the purpose well
to learners, used a sensitive, non-threatening approach to delivering the
assessment and used experienced staff. 5 felt they were flexible about the
time taken, used a good environment for the delivery of the tool and made
good use of the results. 2 felt that results were not fully used and they were
too time-constrained in the delivery of the skills check.

2.7 Recommendations to others
Providers offered the following recommendations to others considering using
the Skills Check:


•  Be aware that the Skills Check is just an indicator and needs to be part of a wider induction process.
•  Learners with SfL needs will need a more in-depth assessment.
•  The approach is all-important:
      o Put learners at their ease.
      o Ensure that staff are fully aware of the issues.
      o Read the guidelines. “Such a crucial process must be done with care.”
•  Allow learners to choose between the paper-based and computer-based versions.
•  The paper-based version gives more opportunity to talk to learners while they are doing it.

4 providers (1 offender learning provider, 1 ACL provider and 2 training
providers) would not continue to use the Skills Check, as it was not
appropriate for their needs or did not provide sufficient information. “For the
effort involved it tells you so little.” “Wait for a more user friendly version with
more appropriate language”.

2.8 Quality Assurance
1 out of the 10 reported that the initial assessment process is included in their
Quality Assurance system. 7 felt that they assure quality by the use of
experienced and qualified staff. Only 1 provider specifically mentioned Quality
Assurance with regard to the Skills Check:

       “We don’t QA the Skills Checks but will now pursue the opportunity to
       do this. For the initial assessment, non teachers have to take unit 1 of
       the Level 2 Certificate in adult learner support. There is no formal
       system for the QA of screening or initial and diagnostic assessment.”

2.9 Summary of conclusions drawn from the responses to the Skills
Check
In most of the providers, the use of the Skills Checks was learner-centred. In
7 out of the 10 providers, feedback on the outcomes of the Skills Check forms
part of the interview process and learners are directly involved in deciding
what the next steps are. The results are kept on the learner’s file and learners
proceed onto an initial assessment.

Generally, those administering the Skills Check liked the tool, finding it
interesting, modern and realistic and sufficiently short so that “it didn’t put
learners off”. In 5 out of the 10 providers learners were interviewed, and were
positive about the experience of taking the Skills Check.




However, responses indicate some lack of understanding of the purpose of a
Skills Check. Several providers compared it to initial assessment tools such
as the BSA, Target Skills and BKSB, and were therefore critical of the Skills
Check for not providing the detail they were seeking about learners’ levels
and needs.

Ineffective practice occurred where screening / initial assessment was
treated as a contractual requirement divorced from the learners’ main
programmes. For an FE college delivering Train to Gain programmes, the
Skills Check was seen by vocational teachers as a quick, easy-to-administer
tool that satisfied the LSC contractual requirement and did not alienate the
learners. However, the results did not inform the learners’ programme of
learning and were ‘filed away’. This failure to respond could have a very
detrimental impact on the employee. By contrast, the tool was used by some
training providers as a welcome alternative for E2E learners who were
repeatedly being asked to take the BSA initial assessment.




3. Initial Assessment

3.1 What are the tools?
The Initial Assessment tools are designed to help learners and their teachers
ascertain the approximate general literacy or numeracy skills level at which
the learner is working and, therefore, if further diagnostic assessment or skills
support might be appropriate. The outcome will tell the learner whether he/she
is working at one of five adult core curriculum levels from Entry 1 to Level 2
and will feed into the diagnostic assessment process.

There are six versions available:
•  Standard – for general purpose use in a range of contexts
•  Workplace – for standard workplace usage (i.e. non sector-specific workplace)
•  Contextualised – for those currently working in or with recent experience of working in specific sectors:
      o Asset Skills – cleaning sector
      o Road Passenger Transport Industries
      o Skills for Health – designed for Health Care Assistants working in a variety of health care settings
      o Skillsmart Retail

The tools are available in both paper-based and computer-based format.

Paper-Based
There are separate literacy and numeracy assessments. A learner answer
booklet is provided and is accompanied by an Assessor Guide, with answers
and curriculum references. The literacy tools also incorporate listening
questions on an audio cassette or CD. Each assessment consists of 25
questions. For the numeracy assessment, calculators are not allowed,
although paper for working out is.
Computer-Based
The computer-based versions are adaptive so that the learners are steered to
questions within their capabilities (although they will be unaware of this). Full
use is made of colour and sound. Scoring is automatic and results are
immediate and can be printed off at the end of the assessment.

There are separate literacy and numeracy assessments.

Computer-based literacy consists of 75 questions – although in most cases
an individual will be required to answer only a quarter of these – to assess
listening, reading and writing. Headphones and a sound card are required to
listen to the audio clips through the computer, which has to be set up to
provide audio output.

Computer-based numeracy consists of 68 questions – although in most
cases an individual will be required to answer only a quarter of these – to
assess number; measures, shape and space; and data handling. A calculator
is available on screen for some of the numeracy questions.


The guidance for administering the assessments indicates that they should be
conducted by fully trained individuals, and the interpretation of the
assessment results and feedback should be overseen by an experienced
Skills for Life practitioner.

Paper-based scripts are marked and scored by the assessor using the answer
booklet; computer-based assessments are marked automatically, with a tick
or cross against each completed question. A summary box gives an indication
of the level the learner has attained, but neither version gives an indication of
strengths and weaknesses.

The paper-based assessment allows learners to miss out questions they can’t
do; this isn’t possible with the computer-based assessment, which also
doesn’t allow learners to go back to questions, although in the majority of
cases answers can be changed whilst still on the specific question screen.
The computer-based assessment does not provide any feedback other than
the level achieved and whether correct or incorrect answers were given to the
particular questions completed. Unlike the paper-based version, which has
core curriculum references, it does not detail the skills tested by correct or
incorrect answers, although the Assessor Guide, a pdf document which can
be accessed from the opening screen, does detail the specific core curriculum
skill against each question number.
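Because the Assessor Guide does map each question number to a core curriculum skill, feedback richer than a list of ticks and crosses is in principle derivable from the print-out. The sketch below, using hypothetical data structures and example curriculum codes, shows how a marked answer sheet could be joined with such a mapping to summarise performance by curriculum reference.

    # Illustrative sketch (hypothetical structures): joining a tick/cross
    # answer sheet with the Assessor Guide's question-to-skill mapping to
    # summarise performance by core curriculum reference.

    from collections import defaultdict

    # Question number -> core curriculum reference, as tabled in the guide
    # (example codes only).
    CURRICULUM_MAP = {1: "Rw/E3.1", 2: "Rt/L1.3", 3: "Ws/L1.2"}

    def summarise(answer_sheet, curriculum_map=CURRICULUM_MAP):
        """answer_sheet maps question number -> True (tick) or False (cross).
        Returns correct/attempted counts per curriculum reference."""
        totals = defaultdict(lambda: [0, 0])  # ref -> [correct, attempted]
        for number, is_correct in answer_sheet.items():
            ref = curriculum_map.get(number, "unmapped")
            totals[ref][1] += 1
            if is_correct:
                totals[ref][0] += 1
        return {ref: f"{c}/{a}" for ref, (c, a) in totals.items()}

    print(summarise({1: True, 2: False, 3: True}))
    # {'Rw/E3.1': '1/1', 'Rt/L1.3': '0/1', 'Ws/L1.2': '1/1'}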

In the evaluation, a total of 1814 assessments were completed. There were
roughly equal numbers of literacy and numeracy (902 and 912 respectively).
However, approximately 75% of all assessments were paper-based in format.

38 providers used the IA tools: 3 ACL providers, 5 Army Education Centres,
9 FE colleges (including for their workplace learning provision), 6 offender
learning providers and 15 training providers.

Overall summary of responses to the interview questions:

3.2 How the initial assessment tools are used

        Purpose
The majority of FE and ACL providers use the IA to identify a level for their
learners and to provide a starting point for ILPs. Many of the training providers
will also use it to determine learner levels, and some will go further and use it
to inform the ILP and test readiness. In most cases it is seen as a contractual
requirement from the LSC.

One training provider uses the tool as part of the three-week induction
programme for Health Care Assistants (HCAs). The tool has two main
purposes – to establish whether the HCAs have the literacy and numeracy
skills to do the job, for example, the skills to write reports on the ward, or read
charts. If not, support will be provided to ensure they acquire the necessary
skills. It is also used to assess literacy and numeracy levels for further training
– many HCAs, for example, want to go on to nursing training.



In the army centres, the IA is used to find the level of the learner to decide
what support they need. In all cases it is used to determine promotion
opportunities (Level 1 over three years is the aim for all soldiers, and
necessary for those young soldiers who wish to become corporals by joining
the Command, Leadership and Management Programme.)

In the ACL centres, IA is used for prospective SfL learners in order to place
them in appropriately graded classes. “They are self-presenting as SfL
learners so we don't need a skills check to verify that, we need something
more to place them appropriately. We thought it might be this IA, but we
always want to see some free writing and to talk through their aspirations.”

In the offender learning centres it is used as a starting point for determining
the appropriate education or training programme for learners, but may also be
used to determine the level of support that may be provided by a Learning
Support Assistant for young people in custody aged 15-18. It also provides an
opportunity to begin the process of interacting with the offenders as this
comment from Probation indicates:

       It’s a starting point leading to DA (diagnostic assessment) – it gives the
       level to aim the DA, and a very important way to break the ice. It is also
       the most important part of the interview as it builds rapport and shows
       offenders the style of work they will be involved in. It is also critical in
       this context as learners don't refer themselves here.

        Prospective learners with whom the tools were trialled
In the Further Education colleges a wide range of learners was involved:
adult learners in discrete provision E1 to L2, on and off site; some rural
learners; work-based employees, mainly via Train to Gain and also a pre-
employment HGV drivers ESF project; learners on WBL health-related
courses; and workplace learners from the local Primary Care Trust.

Training providers involved learners from the following broad groups: all their
current learners (one provider indicated that this will be extended to all staff in
due course); NVQ learners who need additional training in Literacy; JCP
learners, Train to Gain, E2E; new HCA employees of the Trust; classroom
assistants, administration assistants (part of the LSC contracted
requirements); those referred by the Job Centre (or by probation service);
people working in the cleaning sector, released for training during working
hours.

In the army, there is a wide range of personnel who will be given an IA for a
variety of purposes, although, from April 2006, there has been a requirement
to IA all new recruits as the following statement indicates:

       As of April 2006 all new recruits have to sit the new on-line
       assessments and have to achieve Entry 2. They are also screened for
       speaking and listening during interview and have to be assessed as
       Level 1. In the AECs you have soldiers who are referred by their Unit
       or who self-refer. A whole unit might be tested and then blocks of


       soldiers released for training. Units are obliged to release soldiers who
       are below Level 2 to attend English and Maths courses.


In Offender learning, IA is part of the Induction process, and will usually be
carried out in the first few weeks in custody. In the young people’s estate, it is
compulsory that all learners are assessed (unless they can show evidence of
achievement or IA results completed within the last six months). However, it is
usually voluntary for adult and young offenders, although rarely refused.

      Additional assessment used
Many providers indicated they used additional methods, particularly interview
(51%) and free writing (40%), to gain further information.

       Suitability of venue
Almost 80% of centres felt that the venue was excellent or good. In FE
provision in a workplace learning context for Train to Gain, there were
examples of unsuitable venues, as shown from the following comments:

       The first course was delivered in a nightmare dingy, noisy, dirty pub
       room. We changed to a leisure centre for second course, which was
       fine.

       Usually the workplace venues are okay, but occasionally you’re all
       hunched over a laptop in a manager's office, or doing an IA in a
       crowded canteen

       Paper-based versus computer-based
The majority of IAs taken used the paper-based version, often because of
lack of access to ICT equipment (given as an explanation by 25% of
providers).

However, other reasons include the fact that older learners may feel less
comfortable with computer-based assessment, although many younger
learners, e.g. in an offender context, prefer the interactivity of computer-based
assessment. The army feel that all their personnel are IT literate and therefore
the soldiers prefer using the computer through their LearnDirect centres.

There are some differences of opinion about the information offered by each
method of delivery: with the paper-based assessment, handwriting and
observed reading pace can give clues to level, while the computer-based
assessment offers ease of marking and immediately available information for
feedback.




       Time allowed/taken
Most centres report that the time allowed for IA is between 45 minutes and an
hour, and that the majority of learners completed it within 45 minutes.

       Staff training
Most staff appear to have no formal training to deliver IA, although in-house
training appears to be the main method of ensuring staff are competent to
undertake effective assessment.

It does, however, vary greatly, and can range from high-level qualification
requirements to none at all, as the following examples illustrate. In FE,
requirements range from no training identified (although acknowledged as
desirable) to all those delivering assessments having to be qualified to Level
4, and all variants between. In ACL, assessments were undertaken by
experienced SfL staff who were not perceived to need training. In the army,
Basic Skills Co-ordinators had a day’s training on how to administer IA. In
offender learning, much of the training is in-house, although there does not
appear to be any consistency across the offender estate.

        Telling the learners about undertaking the assessment
Most of the providers understand the need to introduce the IA in a sensitive
way. Recognition of the fact that time taken at this stage can be valuable is
noted, and that it is important to put learners at their ease, advising them that
‘this isn’t a test’, as the following quotes indicate:

       We say every time: you can't fail, it isn't a test, take your time, there is
       no time limit. We need to find out about your abilities and skills. We
       need to see what you can do without help.

       For numeracy, I always emphasise, if you get stuck, don't brood and
       give up, move on and try the next one.

One organisation ensured consistency in feedback as shown by the comment:
‘we have a script we read through.’

Despite the fact that it seems to be mandatory for a large percentage of
learners to take IAs, there are few reported refusals, even in what may be
considered difficult contexts, such as the young offender estate. The
following quote indicates that teachers are sensitive to the needs of learners:
“However, no one would be forced to take the assessment if they didn't want
to. Assessment could be done when they have settled into a group.”

This attitude was also reflected in the adult offender estate, where tutors
noted that they would not administer an IA immediately if they felt it might be
too soon, or too stressful for a learner. In the offender estate, many potential
learners may arrive in an induction session where IA is completed showing
drug withdrawal symptoms, or other stress indicators.




3.3. How the results are used
The majority of learners are given feedback about their results immediately
after completing the IA, in a 1:1 situation rather than as a group.

Many providers were critical of the results print-out from the computer-based
tools. They felt that, in providing merely a print-out of question numbers with
ticks and crosses against them, it was difficult to give learners any
meaningful feedback other than their identified level. Although the Assessor
Guide includes a table detailing the question number and the core curriculum
code and skill relevant to each question, this appears either not to have been
of sufficient value to providers for feedback purposes (given that you can’t
see the actual question but only get an indication of the skills covered), or
providers hadn’t accessed the information and understood its significance.

Many providers therefore opted to use the paper-based assessments as
these allowed them to have a discussion with the learners around particular
questions that they had got right or wrong and to begin to identify individual
learning needs.

Comments from all providers suggest that care is taken in giving feedback to
learners. Using learner responses to the assessment questions, they will
discuss learner aspirations and next steps on the learning journey, as well as
using the opportunity to begin the teaching and learning process.

       I focus on a couple of questions that the learner got wrong and discuss
       why the answer is wrong and what would have been the right
       response. You need to be positive about what the learner can do and
       honest about areas for improvement.

In some cases, the IA results were used to inform the appropriate course for
the learner or to lead onto diagnostic assessment.

       I have a discussion, and tell them I will follow up with the diagnostic, I
       will begin to plan the learning programme with them at this stage, it
       may change their own view of what they can do and need to do – e.g.
       a learner who thought her main problem was spelling but in fact the
       assessment showed it was punctuation. You can start to create an
       independent learner here.

In the case of the army, the majority of AECs will go through the answers with
their learners on a 1:1 basis, explaining the levels and where more practice is
needed. As the results of the IA can be high stakes in this context (a
requirement for promotion), sensitive feedback is usually given, as shown in
the comment:

       You need to be sensitive to their disappointment that they are not
       Level 2 – access to promotion is dependent on being Level1/2.




3.4 Learner responses to the assessment
The vast majority of learners asked about their experience of the IA felt it gave
a fair judgement of their skills. A minority thought that the assessment put
them at too low a level.

FE providers felt that roughly half of their learners found the assessment a
positive experience, with few surprises. However, there were some
learners who found it stressful and demoralising. These contrasting views are
shown by the following two comments:

       It was their first experience of assessment, so there was really lots of
       anxiety. A lot kept saying self-deprecatory things like “I’m self-taught
       me, I never went to school, I can only write in capitals” etc.

       Most were reassured and found the assessment 'no big deal'; they were
       surprised at how well they did, positive because they know where they
       are and are ready to get going. However, this is due to the way the
       assessment process is handled and not just down to a particular tool.

There were concerns expressed about the length of the assessment; in one
case a NACRO centre identified that this had been a problem with one group
of learners. They suggested that a save facility for the computer-based
assessment would have helped to overcome this.

       The JCP (adult) learners were fine with it. 8 E2E learners however did
       not complete. They couldn't sustain concentration long enough to
       complete it. They got frustrated when they couldn't pass a question
       they couldn't do and couldn't save what they'd done to go back to later.
       They were demotivated to carry on with the initial assessment process
       because of this. They would have liked to have been able to save
       partially completed assessments to go back to later.

One of the 3 ACL providers identified what they considered an important
issue, which echoed the feelings of learners in the Training Provider context:

       Existing learners came out at a level below the accreditation which
       they were working towards and felt upset

The difficulty of accessing the assessment for learners at lower levels was
also noted in this context. A strategy of supporting learners by reading out
the maths questions was suggested, but this would be time-consuming for a
large number of learners.

       Most were fine with it, but there was some resistance to the way that
       the numeracy questions were asked, i.e. too many words. They have
       found that some learners come out as E1 or below on this assessment
       who can do E2 calculations if they are helped with the reading.




3.5 The effectiveness of the initial assessment tools

        Strengths
The majority of providers (65%) felt that the tools worked very well or well with
their target group. Key strengths identified by providers are:

•  The contextualised versions were generally welcomed and appreciated.
•  Many providers felt that the skills covered were wider than in the BSA tool and the skills were assessed more effectively.
•  The inclusion of listening tasks was mainly commented upon favourably, and many providers felt it added value to the tool.
•  Some of the providers felt that it gave a more reliable assessment of the skills, although others felt it didn’t reflect the real level of the learner (usually setting the level too low).
•  Generally the computer-based assessment was found to be easy to use and engaging for learners, with clear graphics and an easy-to-access interface. Several positive comments were made about the interactivity.

       Weaknesses
35% of the providers felt that the tools worked only OK or poorly, and some of
their comments directly conflict with strengths identified by others. There
were several comments regarding the difficulty of the language used, most
particularly in the numeracy assessment, but also in the literacy assessment
for learners at the lower levels (too much text).

The lack of a save facility partway through the computer-based assessment
was seen as a very clear negative.

The majority of providers were negative about the limited feedback available
from the computer-based assessment. As one said: “A sheet with lots of
crosses on is depressing.” An army education centre reported soldiers were
“very frustrated, desperate to know what their mistakes were”.

An offender learning provider said: “The print out doesn't give the spiky profile
graphical image that PLUS⁵ does. The PLUS printout is useful to support
positive feedback to lads by highlighting those areas where they are able to
use the skills well, rather than just focussing on an overall low level.”

Comments similar to these were evident across all providers, indicating that
guidance on interpreting computer-based results was neither clearly
illustrated nor appropriately signposted.

Many providers identified that their learners may not be able, or may not
want, to complete an assessment in one sitting, so the inability to save
completed answers was not helpful. There also needed to be a facility to
store results electronically.
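A save-and-resume facility of the kind providers asked for could amount to little more than serialising the assessment state between sittings. The sketch below illustrates the requested feature only; it is not part of the current tool, and the file format and field names are assumptions.

    # Illustrative sketch of the save/resume facility providers requested
    # (not a feature of the current tool; format and names are assumptions).

    import json

    def save_progress(path, learner_id, answers, next_question):
        """Serialise a partially completed assessment so the learner can
        stop and return to it in a later sitting."""
        state = {"learner_id": learner_id,
                 "answers": answers,            # question number -> response
                 "next_question": next_question}
        with open(path, "w") as f:
            json.dump(state, f)

    def resume_progress(path):
        """Reload a saved assessment, returning the answers so far and the
        question to continue from."""
        with open(path) as f:
            state = json.load(f)
        return state["answers"], state["next_question"]

    save_progress("ia_12345.json", "12345", {"1": "B", "2": "D"}, 3)
    answers, next_q = resume_progress("ia_12345.json")

Saved state of this kind would also go some way to meeting the related request to store results electronically rather than only as a printed sheet.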



⁵ A version of Target Skills developed for the offender learning context.


Other weaknesses included the fact that the paper-based literacy
assessment in particular was deemed to be too long, to have too many
questions and to be too time-consuming to administer. Providers felt that
fewer questions would still provide relevant information. Some providers felt
that the assessment, in particular the numeracy assessment, came out at too
low a level compared to other assessments or previous testing.

Other comments, which were made by more than one provider, include:

•  The numeracy assessment does not assess adequately at Level 2 – it needs ‘beefing up’.
•  Speaking and listening: the language level is too high; many students didn’t understand the questions.
•  Adult learners: concerns about tick-box assessment – learners can just guess, so you won’t get a true picture.
•  Some providers felt the paper-based version looks childish – too much space and childish illustrations.

Although 47% of providers indicated they had had no problems with the audio
element of the assessment, the other 53% identified some, mainly technical,
issues: access to IT equipment, headphones and/or tape recorders. Some
providers who didn’t have access to appropriate equipment read out the
scripts to learners.

The Army reported that the audio element was not liked by the learners, and
that some feedback from learners suggested the speech was slow and boring,
so the learner ‘switched off’. The army is now experimenting with providing
MP3 players with the audio section and this is proving effective in engaging
the learners. One offender learning provider did feel that learners often didn’t
listen carefully, so this was a useful activity which raised awareness of the
need to pay attention; a training provider made the same point.

A large majority of providers (77%) did feel that the listening element was of
value, despite the technical problems. However, one offender learning
provider felt that it had caused problems for her learners:

       One tutor reported that when she introduced it to her learners they
       looked like 'rabbits caught in the headlights', being confronted with this
       straight away. She only used it a couple of times and then abandoned
       it. The speed is confusing: some recordings can be played twice, one
       can only be played once. Learners didn't like it, and one said 'either the
       questions are really tricky or I'm thick.'

The majority of providers (52%) hadn’t used the Good Practice Guidelines for
the skills check and IA. Those that had used them felt, in the main, that they
were useful, comprehensive and self-explanatory.

       Comparison with other tools
All providers indicated that they had used IA tools previously. 36% indicated
they had used BKSB previously, 81% had used the BSA assessment


previously, 20% had used Target Skills, and 26% had used another IA
previously (including the PLUS assessment in custody for young people aged
15-18). The majority of those who had used the BSA previously commented
that they felt this IA gave more detail on the skills, was more thorough in its
approach and covered a wider range of skills. They felt it provided a more
effective/appropriate assessment.

       Literacy covers a wider range of skills than the BSA tool. The tools are
       more contextualised with better lay out and presentation. However the
       numeracy tool needs strengthening. (FE)

       All have strengths and weaknesses. BSA is dated and learners have
       done it too often. (Training Provider)

       It’s good that it assesses to Level 2. It’s a more thorough assessment
       than the BSA, but with BSA you can see people’s writing. 1 learner did
       both the BSA and the paper-based numeracy. She found the new one
       harder. 2 learners did the on-line literacy and the BSA paper-based.
       They came out as E2 on both. (Army)

       The learners liked it, most thought the pictures helped their
       understanding. The ESOL student who found it too wordy, guessed
       from the pictures. (ACL)

       We prefer it to the BSA assessment because it is clearer, more
       relevant and assesses to L2. (offender learning)

Fewer providers had used the BKSB previously, although some were familiar
with, and had used, both the BSA and BKSB assessments. The majority of
providers who had used the BKSB tool felt the DfES IA was more
effective/appropriate.

       The paper version is better than BSA. The computer version is about
       the same as BKSB (which is what they use across the college). BKSB
       progression through the levels seems more gentle and logical. (FE)


       BSA is heavily weighted to reading and spelling; this has better
       coverage e.g. verbs. Assesses up to L2. Found it similar to BSKB.
       (Army)

20% of providers had used Target Skills previously, and the majority of these
reported that they felt this IA was more effective, although many qualified their
view with comments indicating that they felt Target Skills provided feedback
information that was valuable.

       The feedback is less effective than Target skills which provides a
       more useful and comprehensive feedback in that it gives a breakdown
       of performance against particular skills so identifies specific areas of
       strength and weakness. Also Learners can pass on questions and can


       go back and finish off the assessment at a later date. Results can be
       saved. (Training provider)


3.6 Evaluation of the organisation’s use of the tools
All organisations felt that they used the tools in a friendly and non-threatening
way. Many said that they offer a flexible approach by varying the activities and
the tools used for different vocational areas, providing a more relevant and
meaningful activity by which to identify the learner’s starting point.
They recognised that some staff need more training in how to deliver
assessments and feedback, particularly where non-SfL staff are doing this,
as is frequently the case in Train to Gain. They also identified that doing
thorough initial assessment is very time- and resource-consuming, and that
pressures of this kind can at worst lead to very mechanistic processes. They
also recognised that in some cases results are not made best use of,
particularly in contexts where there is no central database on which to record
them.

3.7 Recommendations to others
When asked what recommendations they would make to other users of the
tools, the following types of recommendation were made:
•  It is all about quality – the start of the quality cycle. The learner’s impressions of you and of the process are so important. The tools are part of the process, not the be-all and end-all.
•  You need to be clear and confident about how to administer it, to ensure that learners do not feel even more apprehensive.
•  Take the test yourself.
•  Give feedback on the results immediately or as soon as possible afterwards; don’t leave this to a non-specialist, as it is important to be sensitive in responding to learners who may have been out of education for some time and have numerous barriers to overcome.
•  Be familiar with the Core Curricula (Lit/ESOL and Numeracy) first.
•  Be aware that the results are not fully reliable.
•  Be aware that this is not appropriate for learners at Entry 1 and below; undertaking the assessment would prove a highly demotivating experience.

3.8 Quality assurance
When asked how they quality assure the IA process, providers were evenly
spread in their responses: 47% undertake observations of teaching and
learning, including some IA; 44% recognised there was no quality assurance
in this area, or no formal record of assessment. Similar figures were reported
for the mention of IA in their SAR process.

Some of the providers stated that they did not undertake any formal QA of the
assessment process whilst others had varying procedures in place such as
checking learner ILPs, internal verification and ensuring that assessment
forms were correctly completed.




Many training providers recognised that a more structured approach was
needed and that QA was an area for new developments. Others recognised
the need for qualified SfL staff to deliver the assessments or at the least to
encourage regular meetings with other staff. Few providers used any
systematic way of capturing learners’ views of the process, except as a
general question in a learner survey.

3.9 A summary of the conclusions drawn from the responses to the
Initial Assessment tools.
The majority of the providers across all contexts appear to understand the
broad purpose of the initial assessment tools, i.e. to provide a starting point,
although they didn’t always appreciate how it fitted into the learner journey.
recognised in the majority of cases the need for it to be a positive experience
by giving timely and sensitive feedback.

Providers either didn’t access the tables mapping the questions to the
literacy and numeracy core curricula and skills, or didn’t realise they existed,
and consequently felt that no feedback was provided on the computer-based
assessment.
The majority of providers felt that both the paper-based and computer-based
assessments were an improvement on the BSA IA.

There were arguments both for and against each version of the tool. Some
providers suggested they used the computer-based assessment to provide
them with an indicator of learners’ ability to use ICT effectively. Other
providers found the paper-based version enabled them to
gain further information about learner ability e.g. handwriting, observation of
facility in using a pen, etc. Many providers weren’t able to use the computer-
based assessment because of a lack of IT facilities, which limited choice of
assessment method for both provider and learner.

In the main, results are used to inform either the course/class that is suitable
for the learner, or to inform further assessment and the ILP. Occasionally
learners aren’t given their results. Usually, when results are given to learners,
it is done immediately after the assessment or as close as possible to it.

Providers feel that, in the main, they set the scene and give feedback to
learners in a positive and supportive manner. Feedback is usually linked to
the learner’s course or level and to further development opportunities, to
make it meaningful. Many providers felt that additional activities, e.g. free
writing, interview or discussion, would be helpful in establishing learner level.
Few felt the IA was sufficient on its own.

The context in which the IA is used, and its purpose, influences the way it is
delivered and the follow on for the learner, e.g. in the army it is used to
determine suitability for promotion, in Train to Gain it is a contractual
requirement for accessing the NVQ.

There does not appear to be any formal training available. Much of the
training in the use of IA is delivered in-house, but this isn’t universal across all



sectors. Many providers expressed an interest in and need for further/initial
training in using IA effectively.

The IA process doesn’t appear to be considered systematically in QA and
SAR processes by most providers, although it may be covered in other areas
such as lesson observations and induction. Many managers were aware that
this would be desirable but weren’t always confident about how it should be
approached.

The contextualised tools were particularly welcomed and more would be
desirable.




4. Diagnostic Assessment tools

4.1 What are the tools?
A diagnostic assessment is an evaluation of a learner’s skills, strengths and
weaknesses. It gives a thorough indication not only of the level of an
individual’s skills, but also of the specific areas of work they need to improve.

The DfES commissioned tools cover literacy, ESOL, dyslexia and numeracy.
Each subject pack contains:
•  paper-based task books for the learner
•  a teacher administration manual
•  a CD-ROM with the full version of all the paper-based tasks (including an ILP print-out facility).

These tools have been in use since 2003 and were initially rolled out with a
national training programme to support their use.

Overall summary of responses to the interview questions

4.2 How the diagnostic assessment tools are used.
The diagnostic assessments are used by providers in a variety of ways
according to the client group, the programme they are on and the teacher’s
skills and confidence.

In FE colleges and offender learning centres they are mainly used with
discrete groups of literacy, numeracy and ESOL learners. Training providers
use them with E2E NEETs, and on full framework apprenticeships to
determine the level of Key Skills and also where a dyslexia need is identified.
One training provider has many trainees with disabilities and is developing
ways of providing support for the diagnostic assessment for example for
learners with visual impairment. In one ACL provider they are used in class for
ESOL, and in another for checking progress in the middle and at the end of
the course. A number of providers, including an Army training centre, prefer to
use the Move On practice test as a diagnostic.

Of the 21 providers returning responses, 20 had used the diagnostic
assessments previously, 19 use them in a classroom setting. 8 used paper
only, 4 used computer only and 8 used both. 15 used Literacy, 14 Numeracy,
6 ESOL and 8 Dyslexia. As they have been in use for over 3 years, it was not
possible to quantify the total number of usages of each tool, although clearly
the numbers would be very large.

The choice of paper or computer depended on a variety of factors – the
availability and suitability of IT equipment, the tutor’s experience, the learners’
preferences, the location (for example off-site provision). Some providers liked
the fact that the IT version is easy and quick to use and one college said they
used it to produce group data. One response from offender learning was:
“The CD-ROM version would be used if IT equipment was available,
particularly as it would engage some of the offenders in the assessment, who
may feel it is more adult, carrying less stigma. It would also print out the
results, and report on group data. Some of the earlier questions are too easy
and offenders get a bit fed up of repeating them. The computer-based tool is
adaptive, and would move on much quicker if offenders were competent at
the particular skill.”

Another offender learning provider preferred paper-based because: “You can
observe whilst learners are completing it and can start to collect information
about the learner’s ability, even if informally, e.g. how do they hold the pen?
Are they right or left handed? Do they write fluently? We prefer it for
recording, discussion and formative assessment purposes. It enables tutors
to observe and give useful feedback to the learner whilst supporting other
members of the group, which wouldn't be possible on a computer. You can
also use it as a teaching and learning tool, e.g. when they get stuck, and any
help can be recorded.”

The point at which the diagnostic assessment is used also varies. Some
providers used it in the first teaching session, or as soon as possible after
initial assessment. Others used it flexibly according to need, taking into
account how secure the learners’ skills and knowledge are. A training
provider used it “at the beginning of the course, when using the dyslexia set,
and at the beginning of a block of learning around a particular topic for the
literacy and numeracy”.

Practice ranges from giving the learner the entire booklet to selecting one or
two tasks. In one case the whole booklet was used because the assessor did
not realise it could be split up, while in others the choice was more
learner-centred: “Numeracy: we put the whole pack in front of the learners so
they can see it, but we talk them through how it is set out. Then we let them
determine their own way through the work – i.e. do they want to start with the
'easy' things they feel comfortable with, or with challenges? It's down to the
learners’ confidence and learning style.”

In some cases the provider selects the tasks from the diagnostic assessment
on the basis of the initial assessment, choosing tasks related to the skills
needs identified. Another provider starts with the learner’s strengths and
builds on those.

One FE provider describes the process as follows: “Two tutors adopt largely a
blanket approach at the beginning of the course, batching the tasks, e.g. doing
all the Number tasks at one time and all the Handling data tasks at another.
As tutors get to know the learners and their needs, they adopt a more
individualised approach to the assessments, selecting particular tasks for
particular learners based on their specific needs. A literacy tutor starts with
the free writing task, which helps identify learners' needs and therefore which
diagnostic assessment tasks to select.”

Some providers use an open-ended approach to timing the assessments,
giving the learners as long as they need and suggesting they stop if they
seem to be flagging. Some spread them over several sessions. Where timings
were given, they included: “max one hour at a time”, “the length of the classes
(2.5 hours)”, “1.5 hours” and “as long as it takes, usually 2 hours plus”.

4.3 How the results are used
In Offender Learning and FE some providers transfer the results directly into
the Individual Learning Plan. Others use the results as a basis for discussion
to draw up individual targets for the learning plan. In a probation service the
assessments are used both diagnostically and formatively, depending on the
learner and timescale. If completed in the session, the tutor will mark it, and
any key points noted on the ILP whilst the learner has been completing the
assessment will be discussed and agreed as targets, and the learner will then
sign the ILP. If this isn't possible in the session, the tutor will, with the learner's
agreement, take the DA to mark and bring the results to the next session. Any
notes made of suggested topics to work on will be discussed and agreed with
the learner then confirmed on the ILP.

The two ACL providers used the assessments to inform ILPs, in one case to
produce very detailed curriculum-referenced ILP targets (the group goal is
accreditation) plus soft targets.

The diagnostic assessments are also used to establish support needs. An
Army education centre uses the diagnostic assessments for all trainees on the
first morning of a course to decide on appropriate support for the course; a
training provider uses them to establish support needs on vocational courses.

The outcome of assessment impacts on teaching and learning in the following
ways:
In FE, the diagnostic assessment tool is used to help target planning and
teaching, informing schemes of work, and for differentiation within lesson
planning. One said it wasn’t as useful as it might be, particularly for students
assessed as ‘consolidating’: most students came into this group, but the
assessment tool doesn’t differentiate enough within this category. Most
teachers use other activities alongside diagnostic assessment tools to enrich
their lesson planning.

Responses varied significantly among training providers. In one, the tool was
used for designing teaching; in another it would make a difference if a group
of learners were identified as having the same weakness. One said it would
make no difference to planning, as the scheme of work has already been
decided. The two Army evaluation sites using the diagnostic tool again gave
different responses. One said: “It’s too long to be practicable, it doesn’t tell
you more than you already know, and the course is already designed, so
there is not a huge amount of flexibility to adapt delivery anyway.” The other
Army centre recommended using it – their learners were positive and felt it
was a good introduction to the course.

Of the three Offender Learning providers using the diagnostic tool, one uses
the paper-based version for pedagogical purposes: “The numeracy is used
across all levels and there is a discussion with the learner about how they
want to proceed. They can opt to complete the skills in any order, so they
could start with what they feel are the easier questions, and then try the
harder ones. It is up to the learner to decide. With the literacy, there is more
selection about the tasks and levels that are used with learners.” Tutors in this
centre observe learners carrying out the assessment and gather richer
information about them than the tool on its own could provide.

Another centre would prefer to use the computer-based version, as they feel
their younger learners would respond well to it, but they don’t have enough
computers. The same problem applies in the third centre, which says that
learners are fed up with doing paper-based tests. The second centre uses the
whole booklet, as this seems to work better. They are very positive about the
visual design of the materials, but say that there are too many easy questions
at E3. The third centre also uses the whole booklet, but only because they
didn’t realise they could split it up.

In general, the diagnostic tools appear to have a strong impact on teaching
and learning in the FE and ACL evaluation sites, and a weaker impact in
training provider, Army and Offender Learning sites.

4.4 Learner responses to the process
There was a very wide range of responses to the assessments from learners,
and these were not related to the sector they were in. Some learners felt they
were too tiring, long-winded and time-consuming, with too many questions.
Some did not like paper-based assessments, preferring to do them on the
computer; others did not like the computer assessment. Some learners found
some of the questions difficult to understand.

One assessor in a training provider said that the trainees were initially
surprised to be asked to do this at work, but when they actually did it they
found it useful and related to the work context. The style and format were
professional and made them ‘feel valued’. Some learners gave neutral
reactions (‘no complaints’); others were much more positive and enjoyed
doing them. One training provider said that the learners appreciated the fact
that someone had tried to get to the bottom of their difficulties and that they
could see where their learning programme came from.

4.5 The effectiveness of the tools
The FE evaluation sites were positive about the visual design, graphics and
the way tasks are paced out in the tools. They found the reports useful, but
would have liked them to provide more detail and differentiation within the
three categories of ‘emerging, consolidating, and established’, particularly
‘consolidating’, where most students were assessed as being. They liked the
variety of activities, and understood that they could be used flexibly.

The training providers were positive about the contextualisation of the tools,
and the curriculum references. “They don’t treat the students like idiots,” said
one. They also liked the facility for electronic storage of the results. The Army
evaluation sites felt that the strengths of the tools were ease of administration,
clear instructions and their good visual appearance. ACL sites found the
guidelines useful and welcomed the curriculum references. Offender learning
sites also welcomed the clear, user-friendly appearance of the tools, and
found them easy to mark and reasonably accurate. One particularly liked the
real-life feel of the literacy tasks at L1 and L2.

Evaluation sites felt that weaknesses of the tools included:
    Too many repetitive and easy questions at low levels (particularly
      Tasks 1 and 3);
     The language levels of some of the questions and the reports were too
       hard for E1 and E2 learners, who found them very time-consuming.
       Staff felt that for low-level students they can reinforce the learner’s
       sense of inadequacy.
     The spelling section is inadequate, as it doesn’t assess important
       spelling skills such as proofreading (“We feel our own approach to
       writing tasks is better”). The sentence section is too test-like, and the
       punctuation questions are not clear enough.
     An ACL site says that, in relation to ESOL: “The writing tasks are not
       differentiated enough – low levels don’t know what to say, higher levels
       don’t know when to stop. The listening tasks are OK, but at lower levels
       it’s too slow, repetitious, and the questions are too long. For speaking
       we use our own procedures; this one is much too lengthy. Doing it in
       pairs is more effective and models their qualification better.”
    In some ACL sites, guidelines about using it in parts are not always
      understood by inexperienced teachers.
    The main response of the two Army sites using the tool is that it takes
      too long to be practical.
     Training provider sites felt that the tool’s weaknesses included a
       print-out that doesn’t refer easily back to the material, making it
       unnecessarily fiddly and time-consuming to see which questions
       learners got wrong, and therefore harder to use for feedback. One felt
       that the guidance is inadequate for inexperienced teachers. They also
       complained that it is more time-consuming than other tools, and fiddly
       to set up. They pointed out that because the learner can see the
       number of questions to be answered at the start of the assessment, it
       can be rather daunting or off-putting.
    FE sites consulted felt that the materials don’t need to be in colour, that
      black and white graphics would improve photocopies. They also feel
      that the reports could be more condensed, and split into sections by
      level, to save on photocopying and paper. They feel that the paper-
      based version must be available, as some learners are terrified of
      computers.
     One said that there are problems with the numeracy questions, which
       are sometimes too long and complicated for good comprehension: “I
       prefer my own spelling diagnostic questions, and don’t use the
       grammar and punctuation sections.”
    Another pointed out that the printed ILP gives the same targets for
      every learner!
     An Offender Learning centre felt that the tool is not age-appropriate for
       younger learners, and that it is not suitable for students with special
       needs.




     Finally, it was suggested that the ‘emerging, consolidating, established’
       typology is just not sensitive enough, particularly for ‘consolidating’: “I
       have found that about 70% of students come out in this band. The tool
       needs to be combined with other exercises such as free writing, and
       the teacher should make the assessment over time – this would be
       more reliable than using the tool on its own.”

4.6 Recommendations to others
Most evaluation sites across all sectors recommended that teachers should
familiarise themselves with the tool before they use it, and only use the parts
that they need. One FE site said: “Only use it if you are going to do something
with the results.”

In general, they feel that only specifically trained teachers should use it, and
that observing students using it will enrich the information from the tool that
teachers can use for diagnostic assessment. An offender learning site agreed:
“Be prepared to spend time on discussion and pick up learning points from it.”
This site added: “Don’t get too upset about the terminology, e.g. emerging,
consolidating, but use common sense. Don’t get hung up on the results; they
may not be perfectly correct.”

More than one site pointed out that however teachers introduce the
assessment exercise, some students will still see it as a ‘test’: although its
appearance is different from other assessment exercises they might have
done, it can still be perceived as a test, and in some places is unfortunately
used as one.

4.7 Quality assurance
Most FE evaluation sites use formal observations of assessment work. For
most it is a specific feature of the self-assessment report (SAR). “If you don't
do IA and DA accurately you will not get a good ILP. If we get assessment
right, all outputs will be improved.” All colleges use the results of assessment
for formative feedback and for planning teaching and learning. All think that
using qualified and experienced staff is important.

Some of the training provider sites visited use observations, some are
planning to introduce them, and some rely on qualified and experienced staff;
in one, however, all that happened was that the staff doing assessments tried
them out themselves, making deliberate mistakes to see what would happen.
In Army sites, little use was made of observations, and for most it is not a
feature of SARs.

One out of three ACL sites uses observations for quality assurance of
assessments. They mainly rely on experienced teachers. At present, it tends
not to feature in SARs, but will do following their involvement in this project.

Most Offender Learning sites use informal observations of assessment, and
for most it is carried out as part of observations of teaching. One has
specialist IAG staff, and one doesn’t do observations. There were varied
responses from Offender Learning sites on the extent to which QA of
assessment is built into the QA cycle generally, ranging from very low to very
high. One was convinced that the DA has contributed to improved
performance, and the other two were hopeful that it would.

In general, FE sites were more likely to have formal, systematic QA for initial
and diagnostic assessment, and for it to be specified in SARs. In the other
sectors, sites were more likely to have either informal QA or none at all, and
assessment was less likely to feature explicitly in SARs.

4.8 Summary of conclusions drawn from the responses to the
    diagnostic assessments
When used in a learner-centred way, the diagnostic tools are used flexibly in
terms of selection of task, timing and purpose. Factors which may be taken
into account are:

                 o Specific needs – dyslexia, ESOL
                 o Literacy and numeracy needs identified through the IA
                 o Learners’ confidence and learning preferences – how do
                   they want to select, do they want to start with the easy
                   tasks, or be challenged?
                 o Need to supplement with own materials
                 o Need to adapt for learners with disabilities
                 o Whether the provision is discrete or vocational (Key Skills)
                 o Identifying group as well as individual needs –
                   differentiation
                 o How ILPs and target-setting are used

Sometimes it is appropriate to give a learner the whole booklet to work
through, but in some cases this is done because the tutor is not sufficiently
trained or experienced to make a reasoned selection.

More research needs to be done into how the diagnostics are used to draw up
individual learning plans, and what is good practice. In some cases the
printouts are used mechanistically to generate the learning plan, in others the
diagnostic tools are used as part of a range of approaches to identifying
learning needs and preferences. So much depends on how experienced the
tutor is and how well they manage the process in relation to the individual
learner and group.

In terms of the choice of computer-based or paper-based versions, there are
advantages to using the paper-based version because it is more flexible, but
also because tutors can be more closely involved, observing the process and
how the learners engage with it. This can produce a richer range of
information with which the teacher can assess the student’s profile more
accurately and sensitively. In these circumstances feedback to the learner will
be improved as well. In some situations the computer-based version of the
tool may be preferred because it is ‘more adult’ and makes learners feel less
singled out. Providers like the fact that the computer-based assessment will
allow them to identify group needs.



5. Key findings of the research
The following points summarise the key findings:
     The context in which the initial assessment is used, and its purpose,
       influence the way it is delivered and the follow-on for the learner.
       Context and purpose are more influential than sector in determining the
       value of the process.4
     How and why the tools are used has a more critical impact than the
       tools themselves.
     Policy should not allow or encourage initial assessment to be
       separated from teaching and learning provision. The organisation
       providing initial assessment (as opposed to skills checking) should
       normally be the organisation providing the teaching and learning. This
       would encourage the best use of the tools for supporting good
       pedagogy.
     Where practitioners were experienced and well qualified, they had the
       confidence and expertise to use the tools appropriately and flexibly to
       meet individual needs. The tools were seen to help placement and to
       inform teaching, which in turn impacted favourably on retention and
       achievement.
     Where the tools were used most mechanistically, there was the
       greatest risk of their being least useful, and indeed potentially harmful
       to learners' confidence.
     Quality assurance mechanisms often did not cover the IA process, and
       few opportunities were seen for feedback from learners.
     Providers felt that, in the main, the tools were well presented, engaging
       and fit for purpose. They particularly welcomed the vocational /
       workplace contextualisation.




4 See Appendix C: Models of Use.


6. Recommendations

Further development of the tools
    Further contextualisation, for example to a wider range of vocational
      areas, would be welcomed
     It is important that a choice of paper-based and computer-based tools
       continues to be available; many providers were not able to use the
       computer-based assessment due to a lack of sufficient resources
    Reduce the time required to take both literacy and numeracy initial
      assessments to a maximum of 90 minutes. In practice, many learners
      take both assessments in one session
    Consider adding a wider range of questions at Level 2 numeracy
    An easy to access ‘at-a-glance’ view of the answers for marking should
      be included in the Assessor Guide
    To support practitioners to give useful feedback from computer-based
      initial assessment:
           o Include a search facility to enable tutors to pull up specific
               questions
           o Include detail of the skills assessed against each question in the
               results print out
    Include a save facility in the computer-based initial assessment tools
    Increase the functionality of the computer-based initial assessment
      tools to allow a learner to go forward, miss questions and go back to
      them
    See Appendix D for further issues relating to particular questions

Support for their effective use in practice
Support for quality improvement needs to be process-focused and interactive
(for example, through mentoring or action research). Few of those interviewed
had made effective use of the written guidance provided.
     Focus for support:
          o Using the tools effectively particularly within E2E and Train to
             Gain programmes
          o Flexible and creative use of the tools within a learner-centred
             assessment process – ways of adding value to the initial and
             diagnostic assessment with additional activities
          o Recognising when the use of the tool is inappropriate and
             identifying more appropriate assessment methods, for example
             for learners at Entry 1
          o Effective diagnostic assessment of learner needs
          o Using diagnostic assessment to draw up individual learning
             plans
           o Support for non-specialist practitioners in the use of appropriate
              tools within a learner-centred assessment process, and in
              ensuring that results inform support, teaching and learning
          o Support for managers on self assessment and strategies for
             quality improvement of initial and diagnostic assessment




Case study 1: Process issues

This case study looks at process issues involving initial assessment in two
contrasting situations. It aims to illuminate the pros and cons of separating
initial assessment activities from teaching and learning. In one case, the two
stages of the learning journey are separate in terms of where they take place,
when they take place, and who carries them out. In the other, initial and
diagnostic assessment, and teaching and learning, take place in the same
organisation, close together in time, and are carried out by members of the
same team, or maybe even by the same person.

In case one, a training provider is carrying out initial assessments for learners
referred by a neighbouring Job Centre. The Job Centre has identified the
learners as needing skills for life learning, and sends them to the training
provider for initial assessment. The continued payment of benefits for the
learner depends on them attending the training provider for initial assessment
and returning to the Job Centre at a later date, and most learners arrive at the
training provider soon after being referred, often on the same day.

The training provider does not know how many will be referred on a particular
day, and has to deal with the learners as and when they appear. They have a
small room with 6 computers in it. Usually a specialist assessor carries out
the assessment, someone specifically trained in house to do the job, although
occasionally it has to be an administrator. The assessor explains that this is
an assessment, that the learner cannot pass or fail, and that its purpose is to
determine the strengths and weaknesses of the learner’s skills. Learners are
encouraged to ask questions. They have the opportunity to choose to do a
computerised or paper-based version of the assessment. They then sit in
front of the computers or at the table in the same room, and start.

Usually they do both literacy and numeracy assessments one after the other.
On average they take about 30 minutes for each, but on occasion each part
can take up to an hour. The results are given to the learners immediately
after they finish if they have done the computerised version, or marked
immediately if the paper-based version has been used. There is no real
opportunity to involve the learner in discussions about the significance of the
results or about next steps, because the results print-out doesn’t give any
feedback information, simply a level. Also, because it is not in the training
provider’s remit to provide feedback, they often do not. The result is then sent
by email to the relevant learning advisor at the Job Centre. Even if the
assessor has time to give formative feedback, they have little useful
information on which to base it.

At the next stage of the learning journey, the Job Centre advisor will have just
one piece of information that they didn’t have before, when they make
decisions about the next steps. Again, they will have no information on which
to base feedback to help with learning. Usually the learner will then be
referred to a learning provider, which may very well not be the training
provider which assessed them. In this case, we can see that the assessment
process is emphatically divorced from teaching and learning, and there is


virtually no emphasis on formative feedback. In other words, the assessment
exercise has had no pedagogical content. A future provider of learning for this
learner will have minimal useful information with which to plan a learning
programme. If for any reason the learner does not take up a learning
opportunity, they will have nothing useful or constructive to show for the
exercise.

In the other situation, the learner is assessed in the same centre where they
will be offered a learning opportunity. In this case, time spent by the assessor
in exploring the implications and details of the assessment, and giving
feedback to the learner, will be time well-spent, because it can be
communicated easily to the teacher and be used for planning the learning
programme. In effect, in this situation the assessment process is merely an
initial stage of the teaching and learning itself. The assessment process is
used to its full potential, and the process is constructive and useful for the
learner even if they do not take up the learning opportunity. Teaching and
learning is materially supported by the assessment process, in a way which is
virtually impossible when the two stages of the journey are separated.
Together, these two cases support guidelines which clearly discourage the
separation of initial assessment from teaching and learning whenever this can
be avoided.


Case study 2: Using IA in the workplace

Royal Bournemouth and Christchurch Hospital Trust

The paper-based Initial Assessments in Literacy and Numeracy (Skills for
Health) have been used and evaluated in the three-week induction
programme for Health Care Assistants. These are employees new to the
Trust. It is necessary to find out before HCAs go on to the ward that they can
do the literacy and numeracy tasks required for the job – for example reading
charts, writing reports. They need to be able to write accurately and legibly,
and to carry out calculations such as BMI (Body Mass Index). If the
assessment shows that they do not have the required level of skill in any
aspect, support is provided to meet that need. The Initial Assessment is also
used to assess levels of skill for further training – many HCAs want to go on to
nursing training.
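
For illustration only (this worked example is not taken from the Trust’s own
materials): BMI is calculated as weight in kilograms divided by the square of
height in metres, so a person weighing 70 kg who is 1.75 m tall has a BMI of
70 / (1.75 × 1.75) ≈ 22.9.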

The Training Manager said that she was surprised at some of the needs that
the new assessment tools revealed: they showed that there were some
aspects of numeracy that employees found difficult, even though she would
have expected them to cope with them. These would now be built into the
overall induction programme to ensure all new employees had the required
skills.

In the evaluation, the staff and employees liked the fact that the IA was job-
related. It is made clear to the HCAs that it is not a pass/fail test and will not
affect their employment – it is to ensure that they are equipped to do the job.


The Training Manager said that their previous experience in using initial
assessment for literacy and numeracy was good for employees’ confidence –
they may come in worrying about their Maths. They are given the
assessment, and if they need further training that is given in confidence. They
go on to get their Level 1 Numeracy and after that ‘You can see them fly’.

The Training Manager also stressed the benefits to the organisation of
accurate initial assessment: the required skill levels are crucial to safety, and
as many complaints received by the Trust involve poor communication,
improving staff communication skills improves quality and satisfaction
overall.




Appendix A: tools used in the evaluation

Those tools marked * are available as computer-based tools.
(Product order reference codes are included in brackets)

Screening Tool – Skills Check
Literacy and Numeracy Skills Check (SCLN) *
Workplace Literacy and Numeracy Skills Check (SCLN-W) *
Passenger Transport Literacy and Numeracy Skills Check (SCLN-RPTI)*
Asset Literacy and Numeracy Skills Check (SCLN-AS)*
Skillsmart Literacy and Numeracy Skills Check (SCLN-SS)
Skills for Health Literacy and Numeracy Skills Check (SCLN-SH)

Initial Assessment
Literacy Initial Assessment (IALIT) *
Numeracy Initial Assessment (IANUM) *
Workplace Literacy Initial Assessment (IALIT-W) *
Workplace Numeracy Initial Assessment (IANUM-W) *
Passenger Transport Literacy Initial Assessment (IALIT-RPTI) *
Passenger Transport Numeracy Initial Assessment (IANUM-RPTI) *
Asset Literacy Initial Assessment (IALIT-AS)*
Asset Numeracy Initial Assessment (IANUM-AS)*
Skillsmart Literacy Initial Assessment (IALIT-SS)
Skillsmart Numeracy Initial Assessment (IANUM-SS)
Skills for Health Literacy Initial Assessment (IALIT-SH)
Skills for Health Numeracy Initial Assessment (IANUM-SH)

Diagnostic Assessment
Literacy (DAM1)*
Numeracy (DAM2)*
ESOL (DAM3)*
Dyslexia (DAM4)*




Appendix B: Evaluation sites

The columns show, for each site: the number of Skills Check (SC) interviews;
learners assessed with the SC; the number of initial assessment (IA)
interviews; learners assessed with the literacy and numeracy IAs; diagnostic
assessments used; manager interviews; learner interviews; and observations.
A dash indicates that no figure was returned.

Sector | Site | SC int. | SC assessed | IA int. | IA lit | IA num | Diag. | Mgr int. | Lrnr int. | Obs.
ACL | Community Learning and Skills Service, London Borough of Waltham Forest | 0 | – | 1 | – | 9 | 1 | – | 0 | 0
ACL | Continuing Education & Training Service (CETS) | 0 | – | 1 | 40 | 30 | 0 | 1 | 1 | 0
ACL | Derby City Council Adult Learning Service | 1 | 32 | 1 | 17 | 12 | 1 | 1 | 1 | 2
ACL | Total | 1 | 32 | 3 | 57 | 51 | 2 | 2 | 2 | 2
Army | AEC (Army) North | 0 | – | 1 | 27 | 24 | 1 | 1 | 0 | 0
Army | AEC Colchester | 0 | – | 1 | 26 | 29 | 0 | 1 | 0 | 0
Army | AEC Bramcote – Nuneaton | 0 | – | 1 | 3 | 17 | 0 | 1 | 1 | 0
Army | AEC Chatham, Kent | 0 | – | 1 | 20 | 20 | 1 | 1 | 0 | 0
Army | AEC London | 0 | – | 1 | 22 | 7 | 0 | 1 | 0 | 0
Army | Total | 0 | 0 | 5 | 98 | 97 | 2 | 5 | 1 | 0
FE | Bournemouth & Poole College of FE | 0 | – | 2 | 63 | 44 | 0 | 1 | 0 | 0
FE | Central Sussex College | 0 | – | 1 | 27 | 10 | 4 | 1 | 1 | 0
FE | Loughborough College | 1 | 100 | 1 | 20 | 0 | 1 | 1 | 1 | 1
FE | New College Nottingham | 0 | – | 1 | 19 | 0 | 1 | 1 | 0 | 0
FE | Oldham College | 0 | – | 2 | 109 | 111 | 0 | 1 | 1 | 1
FE | Stroud College | 0 | – | 1 | 12 | 5 | 1 | 1 | 1 | 1
FE | West Nottingham College (WNC) | 0 | – | 1 | 30 | 6 | 1 | 1 | 0 | 0
FE | Total | 1 | 100 | 9 | 280 | 176 | 8 | 7 | 4 | 3
Offender Learning | A4E Ltd HMP Littlehey | 0 | – | 1 | 95 | 89 | 0 | 1 | 1 | 1
Offender Learning | A4E Ltd HMP Whitemoor | 1 | 20 | 0 | – | – | 0 | 1 | 0 | 1
Offender Learning | Carter & Carter, HMYOI Werrington | 0 | – | 1 | 43 | 43 | 0 | 1 | 1 | 1
Offender Learning | CfBT Education Trust – HMYOI Huntercombe | 0 | – | 1 | 6 | – | 1 | 1 | 0 | 0
Offender Learning | City College Manchester | 0 | – | 0 | – | – | 1 | 1 | 0 | 0
Offender Learning | Josephine Butler Unit, HMYOI Downview | 0 | – | 1 | 11 | 11 | 0 | 1 | 0 | 0
Offender Learning | PALS Project, Dyslexia Action | 0 | – | 1 | 57 | 8 | 1 | 1 | 0 | 0
Offender Learning | Sir Evelyn House Juvenile Unit, HMYOI Cookham Wood | 1 | 2 | 1 | 2 | 2 | 0 | 1 | 0 | 0
Offender Learning | Red Kite Learning | 0 | 0 | – | – | 160 | – | – | – | 0
Offender Learning | Total | 2 | 22 | 6 | 214 | 313 | 3 | 8 | 2 | 3
Training Provider | A4E Ltd Limehouse | 1 | 2 | 1 | 12 | – | 0 | 1 | 0 | 0
Training Provider | A4E Woolwich | 1 | 16 | 1 | 45 | 45 | 0 | 1 | 0 | 0
Training Provider | Achievement Training | 1 | 16 | 1 | 15 | 15 | 1 | 1 | 1 | 0
Training Provider | Basic Training Centre Ltd | 1 | 5 | 1 | 5 | 5 | 1 | 1 | 0 | 0
Training Provider | B-Skill Ltd | 1 | – | 3 | 8 | – | 0 | 1 | 1 | 0
Training Provider | CFBT T/A Include | 0 | – | 1 | 9 | 9 | 1 | 1 | 1 | 0
Training Provider | Milltech | 0 | – | 1 | 23 | 18 | 0 | 1 | 1 | 1
Training Provider | Nacro Dorset | 0 | – | 1 | 14 | 14 | 1 | 1 | 0 | 0
Training Provider | Nacro Newcastle | 1 | 33 | 1 | 47 | 56 | 0 | 1 | 1 | 0
Training Provider | Paragon & ITE Training Group Ltd | 0 | – | 1 | 16 | 16 | 1 | 1 | 0 | 0
Training Provider | RBCH Foundation Trust | 0 | – | 1 | 10 | 10 | 0 | 1 | 1 | 0
Training Provider | Red Kite Learning | 0 | – | 1 | 45 | 80 | 0 | 1 | 1 | 0
Training Provider | Roundabout Training Ltd | 1 | 5 | 1 | 4 | – | 1 | 1 | 1 | 0
Training Provider | Work Solutions | 0 | – | 1 | 5 | 12 | 1 | 1 | 1 | 0
Training Provider | Total | 7 | 77 | 16 | 258 | 280 | 7 | 14 | 9 | 1
TOTALS | | 11 | 231 | 39 | 907 | 917 | 22 | 36 | 18 | 9

Total initial assessments taken (literacy + numeracy): 1824

Appendix C: Three models of use of the assessment tools

1. Skills Check / Initial Assessment in FE, ACL, Army and Prisons, leading to
discrete SfL provision. Purpose: placement onto discrete, levelled provision.

Characteristics of context and use: predictable patterns of take-up (e.g.
advice sessions); smaller volume; experienced staff; learners on longer
programmes; learners generally self-referred.

Strengths: process directly linked to learning; practitioners have the expertise
and freedom to supplement the tools with other methods; time for immediate
feedback and referral into provision; informed signposters with the expertise
to interpret results. Organisationally straightforward but pedagogically
demanding.

Weaknesses: resources – heavy on staff time for 1:1 work, particularly with
the loss of 3/6-hour funding in some sectors (prisons never had it).



2. Initial Assessment leading to Train to Gain or a vocational course.
Purpose: to identify the appropriate level for the main programme and/or
support needs.

Characteristics: SfL not the main learning aim; higher volume at particular
points, e.g. September enrolment; unpredictable and/or continuous provision;
learners needing to see the link between IA and their main learning; not
self-referring; not necessarily done by SfL specialists or trained staff; varied
lengths of programme (as short as 20 hours).

Strengths: contextualised/workplace tools support relevance.

Weaknesses: results not always used to inform learning; lack of
understanding of the results by non-specialist staff; inappropriate venues that
do not allow for confidentiality; untrained staff; inappropriate carrying out of
assessment.

3. Organisation A (JCP, Train to Gain, Prisons, Army) sends learners to
Organisation B to fulfil contractual requirements for IA.

Characteristics: separation of IA from what happens next; often no feedback;
the tool unsupported by any other method; often inappropriate tools specified
by contract; not part of a formative pedagogical process.

Strengths: quick and dirty – perhaps better than nothing.

Weaknesses: mechanistic/bureaucratic; often repeated assessments; not a
good experience for the learner, whilst fulfilling contract requirements.

Appendix D

Issues relating to particular questions
Some questions are inaccurate or misleading. Providers raised the following
issues relating to particular questions:

Initial assessment tools:
 In paper-based numeracy (standard), there are no questions, for example,
    on area or volume. The few questions that are at Level 2 focus on simpler
    areas such as dates (question 21, which is not a good question) and
    directed numbers (question 24).
 In paper-based numeracy (standard), questions 1 and 2 were considered
    too difficult as opening questions. Several commented particularly on the
    inappropriateness of question 1 as the very first question of the
    assessment: it was considered too word-reliant, and something simply
    numeric and visual, such as counting, would be more appropriate.
 In paper-based numeracy (generic workplace) question 1, learners are
    asked to count people in a picture. One head is drawn from the back and
    was missed by 7 out of 9 staff in one provider.
 In paper-based literacy (generic workplace), the answer provided for
    question 1 on the listening test is incorrect. The answer should be
    ‘gardener’ not ‘waiter’.
 Some questions were considered ambiguous or confusing.
    Paper-based numeracy (standard):
       Question 3 is partly based on knowledge of how storeys in a building
         are numbered, and people do not necessarily know that they are
         starting on the ground floor. In a prison the 'ones' are usually on the
         ground floor, the 'twos' on the first floor, etc.; in America the first floor
         is at ground level. Other examples could be used that do not carry
         these cultural-bias issues.
      Question 17 does not include any units.
       Question 21 can be answered as 19th or 20th April. Most practitioners
         thought the correct answer was 20th April, as colloquially a week's
         holiday runs from Saturday to Saturday. This is an unnecessarily
         confusing question.
      Paper-based literacy (standard):
        Question 36: if learners do not add the date (14), tutors do not know
          whether to mark the answer right or wrong. They suggest that if it is
          essential that the date is included, this should be made more
          obvious. At the moment it is easy to miss out, and this is not relevant
          to the skill being assessed.
      Question 24 should be reworded to say, “There should be two full
        stops, where does the other one go?”
       Paper-based literacy question 38 should state how many punctuation
         errors there are.




Diagnostic assessment tools
Some providers felt that:
       there are too many repetitive and easy questions at low levels
         (particularly Tasks 1 and 3);
       the language levels of some of the questions and the reports are too
         hard for E1 and E2 learners and risk reinforcing a sense of
         inadequacy;
       the spelling section does not assess important spelling skills such as
         proofreading;
       the sentence section is too test-like;
       the punctuation questions are not clear enough.
 An ACL site, evaluating the ESOL diagnostic assessment materials, stated
   that:
            o “The writing tasks are not differentiated enough - low levels don’t
                know what to say, higher levels don’t know when to stop.”
            o “The listening tasks are OK, but at lower levels it’s too slow,
                repetitious, and the questions are too long.”
            o “The speaking section is much too lengthy. For speaking we use
                our own procedures. Doing the assessment in pairs is more
                effective and models the qualification better.”



