					                                                  Report on


                     Review and evaluation of the
       Newham Council Social Regeneration Unit


             “Where IT’s @” project

                             Paul Ticher: June 2006
                       (with minor factual corrections, September 2006)




Contents
       1.    Summary
       2.    Background and brief history of the Where IT’s @ project
       3.    Where IT’s @ evaluation criteria
       4.    The software
       5.    Consultancy and technical support
       6.    Broadband provision
       7.    The training programme
       8.    Project administration
       9.    Overall impact
       10.   Conclusions
       11.   Additional lessons
             Appendix A: Agencies participating in Where IT’s @
             Appendix B: Review methodology
             Appendix C: Questionnaire



Acknowledgements
We would like to thank all those managers and staff in the participating agencies who
gave their time and patience to answer so many questions on the telephone or by
questionnaire, and the staff of the SRU for their support and interest in the review, for
their openness about the events during the project, and for providing copious amounts
of valuable written information.



1.     Summary
On the evidence of this research, the Where IT's @ project has overall demonstrated a
successful model for supporting ICT1 development, at the local level, over an extended
period.

Since 2002 the project has provided a well-thought-out package of software, training
and other ICT inputs to some twenty advice agencies in Newham, in order to give
them better tools for carrying out their work and better access to relevant information.

Inevitably, some things did not go as well as hoped. The project managers responded
appropriately to these events and made changes in the light of experience and in
response to feedback from participants. It is to be hoped that other projects will be
able to draw lessons both from those parts of the project that worked well and from
the issues that arose.

This review is based on documentation provided by the SRU, interviews with agency
managers and a questionnaire survey of agency staff. It aimed to look at each element
of the package separately, and also to find out if the project overall has provided:
    benefits to the agencies using the system;
    benefits to service users;
    better information about service users and their needs.

In the agencies where the project has worked well, it has achieved all of these. One of
the key factors affecting whether it has worked well or not is the level of management
commitment to the project in the particular agency. For those who could see the
benefits and who were in a situation where they could devote time and energy to the
project, the Where IT's @ package proved extremely effective, even transformational
for the agency.

Other agencies also benefited, but to a lesser extent. Factors which reduced the
impact of the project were largely outside the control of the SRU. These included
changes within agencies, and external events which made the project less relevant to
them or which diverted their attention. Nevertheless, the majority of agencies which
participated in the project benefited from it.

The package consisted of four elements:
   Software
   Training
   Consultancy and technical support
   Broadband provision

This review finds that all elements of the package were appropriately conceived and
well delivered, with the partial exception of the broadband provision, where technical
problems delayed the start of the project and continued to dog it for some time.

1    Information and Communications Technology: essentially computers, their connections to each
     other and to the internet, and software.


2.      Background and brief history of the Where IT’s @ project
The Where IT’s @ project of Newham Council’s Social Regeneration Unit (SRU) began
in January 2002, and has been financed throughout from the Neighbourhood Renewal
Fund. It had thus been in existence for just over four years when this review was
commissioned, in March 2006.


What the project was about
The declared objective of Where IT’s @ was “to enable advice agencies in Newham to
take advantage of new legal software which was launched during the late 1990s and to
get a good quality connection to the internet to take advantage of free information and
resources and e-mail”.

Four pieces of software were to be made available:
   AIMS, for case recording, a Lasa2 product.
   the Lisson Grove benefits and tax credits calculation software.
   an electronic version of the benefits information in the CPAG3 handbook.
   the Citizens’ Advice Electronic Information System, now known as AdviserNet.

In order to give the agencies access to the software, each of the products was installed
on dedicated servers based at the council’s ICT depot, where they could all be kept
regularly up to date. Each participating agency would access this via a broadband
connection. When the project started in 2002 broadband was a new technology. For
technical reasons it was decided that agencies should be provided with higher-capacity
SDSL links, rather than the consumer- and small-business-oriented ADSL4 which is
now more common. Installation of the broadband links was overseen by the council’s
technical staff.

In addition, the project needed to ensure that the ICT infrastructure within each
agency was sufficient to allow all relevant staff access to the broadband connection. In
practice this mainly required that the agency had a suitable network, and if necessary
they received a limited amount of technical support, bought in by the project, to set up
peer-to-peer networks. Funding for any hardware needed was not directly available
from the project, but advice was given on funding sources.

As part of the overall intention of improving their use of ICT, each agency that wanted
one also received a general ICT healthcheck, leading to recommendations on which it
could base its ICT investment strategy and improve its ICT management. In 2002
these healthchecks were carried out by Lasa; for those agencies recruited in 2004–05
they were carried out by independent consultant John Pipal. As they were
confidential to each agency, it has not been possible to see any of these reports or get
an idea of their scope or content.



2    London Advice Services Alliance (www.lasa.org.uk).
3    Child Poverty Action Group.
4    SDSL is a “symmetric digital subscriber line”, as opposed to the “asymmetric” line provided by
     ADSL, in which the speed in one direction is significantly slower than in the other direction.


Finally, all the participating agencies were offered training, from Lasa and SRU staff,
in the use of the software being provided and also in using the internet for advisers —
since a consequence of the broadband provision was that all agencies would have
excellent internet connections.

In addition to the Where IT’s @ support, participating agencies had access to the SRU
web site (provided through the council extranet), which makes certain SRU services
available to external statutory and voluntary agencies. This web site is the subject of
a separate review report.


Selection of agencies, and their characteristics
In preparation for the Where IT's @ project, in 2001 a survey of ICT in 38 advice
agencies in the borough was commissioned by the SRU from the Lasa Information
Systems Team.

Twenty agencies were then selected for Where IT's @ in 2002 (including Stratford
Advice Arcade, which was not an advice agency, but which hosted several agencies
and helped to provide their infrastructure). The selection criteria were uncomplicated:
each candidate agency had to have at least one PC and either to have achieved the
government Quality Mark for advice-giving, or to have the intention of applying for it.
(See Appendix A for details of which agencies participated at various times.)

The agencies were generally small, with very undeveloped ICT. A few key figures
from the survey for those eventually selected for Where IT's @ are5:

                                                    Average              Range
Full-time paid staff                                   3½                0 – 20
Part-time paid staff                                    2                 0–6
Volunteers                                            10½               0 – 100
Number of computers                                     4                0 – 15
                                 None             Peer-to-peer           Server
Network (2+ computers)             6                    4          2 (1 not working)
                                 None               Partial*              Full*
Internet access                    4                   13                   1
* “Full” access meaning that each user has access to the web and e-mail from their
  individual desk; “partial” access is anything less than this.

Two of the agencies had no paid staff at all, and two (different ones) had no computers
at all at the time of the survey, while four had only one computer. Of those with more
than one computer, only half had a network. Internet access was almost exclusively
by modem; two agencies had the superior ISDN; only one had access at an equivalent
speed to broadband — through a leased line.

5   These figures are for agencies that both participated in the Where IT's @ project at some stage and
    were included in the Lasa survey. Three of the original 20 (including Stratford Advice Arcade) were
    not in the Lasa survey, and two of those surveyed joined later. Those that joined and left had similar
    characteristics, so the data gives a good idea of where participants started from.



This picture is a clear indication of the challenge that faced the Where IT's @ project
at its outset.


Subsequent developments: overview
One of the key features of this project is that even over a relatively short period of four
years it has had to respond to a considerable amount of change, some external, some
internal. These changes include:
    Participating agencies dropping out, for a variety of reasons.
    Changes within participating agencies, such as moving premises, changes in
      funding or activities, and significant staff changes.
    Funding changes, so that the package which could be offered to participating
      agencies had to be changed, almost from year to year.
    Technical developments, in particular the rapid increase in the general
      availability of broadband, so that something which was rare and technologically
      advanced at the beginning of the project is now commonplace.

The project has generally responded well to these changes, adapting over time, and
has also made pro-active changes in the light of experience. These developments have
included, for example:
    Replacing the agencies which dropped out, but using additional selection
      criteria, in the light of experience.
    Changing the supplier of technical support.
    Switching the connection method to ADSL as it became more easily available.
    Switching the benefits calculation package on offer in response to feedback.
    Continuous development of the training programme.

Developments during the life of the project are now discussed in more detail, drawing
on material provided by the SRU and collected from participating agencies as part of
this review. (See Appendix B for methodology and how the information was collected.)


Changes in participation
Nine of the agencies originally selected dropped out, for various reasons, between 2002
and 2005. The reasons were largely outside the control of either the agency or the
SRU:
   Two quickly decided that the project was not relevant to them and withdrew.
   Four closed their advice service (or in one case the whole agency).
   One moved and was unable to obtain a broadband connection at its new
     premises.
   In two cases a change of manager led to the agency deciding it was no longer
     interested in Where IT's @.

Four of these nine agencies maintained their involvement with the project until March
2005, and it could be surmised that during their three years of participation they
obtained similar benefits to those agencies that stayed involved until 2006. In


retrospect it might have been useful to have included them in the data collection part
of this review, in order to confirm this.

Some of the agencies that dropped out were replaced, and a new tranche of funding in
2004 allowed an additional batch of agencies to be recruited, having been identified as
meeting the project criteria.

In total, therefore, 31 agencies have been involved with Where IT's @ over the four
year period, and only 11 of these have been involved throughout. This degree of
turbulence clearly has an impact on the capacity of the project, since it results in
different agencies being at different stages at any one time, and therefore requiring
different types of support. It also means that relationships between the project and
the agencies — and particularly with the manager responsible for the agency’s
participation in Where IT's @ — have had to be built up from scratch each time a new
agency is recruited.

Given that the reasons for agencies dropping out were not, in the main, arbitrary,
there seems little the project could have done to reduce the turnover. It is hard to
imagine selection criteria which could predict that an agency will stop providing an
advice service a year — or three — hence. It is just a fact of life that the voluntary
sector is in constant flux as funding and social priorities change; it is a useful lesson,
however, that any project seeking to engage in long-term development work with a
small number of small voluntary agencies should plan for a significant level of
turbulence6.

The agencies which dropped out were — on the evidence available — smaller than the
average size of agency involved with Where IT's @. (This applied particularly to those
that dropped out for practical reasons rather than after a change of manager). It is
reasonable to assume that smaller agencies are likely to be more volatile, and this also
should be taken into account.


Changes within agencies
Many of the agencies that were participating in Where IT's @ underwent significant
changes during the course of the project. Of the 16 that were interviewed, four had
either moved premises during the Where IT's @ project, or were about to. One of these
had moved because it separated off from its original host organisation. One other
agency had merged to become part of a larger organisation.

There had also been many changes of staff at the participating agencies. Of the 16
managers interviewed for this review, five had joined the agency some time after it
joined Where IT's @ and knew very little about the setting-up process or early
experiences of involvement. A further five had been with the agency and involved to
some extent with Where IT's @, but not necessarily in the role of Chief Officer. The
remaining six had been involved as Chief Officer or in a senior management role in
establishing their agency’s participation.

6   Conclusions in the text are summarised at the end of this report.



Changes in personnel can clearly be disruptive for those working with them on a
project such as Where IT's @. We have already seen that two agencies dropped out of
the project because of a change of leadership. Of the five agencies still in the project
where the manager had arrived recently, only one rated the project as an unqualified
success at their agency. (See below for details, and comments on the SRU response.)

Office moves can also be disruptive, but good planning can mitigate this. While
several agencies did comment that it had required extra work to maintain their
involvement in the project through an office move, others stressed that participation
in the project had helped them to plan a smooth transition, and for some the new
premises were a big improvement, providing facilities that had not previously been
available.

Half of the agencies interviewed had changed significantly in size over the course of
the project (in addition to those merging or splitting). The majority (six) of these had
expanded, although two had contracted, owing to changes in their funding or client
base. These changes generally appeared less disruptive, although obviously they
bring with them a need to adjust the ICT infrastructure and to provide additional
training for new staff.


The package on offer from Where IT's @
In addition to the software itself, the Where IT's @ package included consultancy,
technical support and training.

The consultancy available to the original 20 agencies, during 2002, comprised:
   two free training courses on Managing ICT and on Financing ICT, delivered by
     Lasa. Participants represented 14 of the 20 agencies.
   a free consultancy from Lasa for managers, with the aim of producing an ICT
     strategy for their agency. About half of the agencies took up this offer. (Of those
     currently involved, four had a consultancy, six decided it was not needed.)
    advice on where to obtain funding for necessary improvements to agencies’ ICT
     hardware and software. A few agencies obtained small grants from the
     Community Chest.

For those which joined subsequently — and those which did not take up the offer first
time round — a less-comprehensive package became available in 2005. (Part of the
reason it was less comprehensive was that the ChangeUp7 funding initiative was
proposing to offer training in ICT Strategy. This was in fact substantially delayed, but
was advertised by SRU when it became available.) In 2005, nine agencies received a
healthcheck and technical support to ensure their computers were networked and that
they were free of viruses and other technical problems. (These agencies included two
which had opted out the first time round and one of the original ones which received a
repeat visit.) In 2006 a further training programme on managing ICT was delivered.

7   A central government initiative to build voluntary sector capacity in a number of areas, including
    ICT.



Technical support consisted of four elements:
   The council‘s IT department arranged broadband installation in 2002 (discussed
     below). All agencies that joined in 2004 already — in theory at least — had
     broadband connections.
   Five of the original agencies needed their computers networked before they could
     be connected to the council server. The SRU arranged for this to be procured
      and carried out by Newham Training Network’s ICT support officer, for the four
     agencies without their own ICT support.
   External suppliers (Lasa in 2002–03, then New Deal IT Services in 2003–04)
     were employed to set up agency computers so that they could connect to the
      Where IT’s @ server and software.
   In January 2005 John Pipal, an independent ICT consultant, was employed to
      offer ICT consultancy and simple ICT support, including installation of peer-to-peer
      networks for small agencies, configuration of PCs and resolution of issues
     such as lack of a firewall. John Pipal also delivered the 2006 Managing ICT
     training programme.

In several instances, agencies chose not to use the Where IT's @ consultant for
technical support, as they had their own existing support arrangements, either
through internal staff or external providers (in some cases independent arrangements
with John Pipal).

The training programme — offered free of charge — consisted of:
   Managing ICT and Financing ICT, provided by Lasa in 2002.
   Managing IT, provided by John Pipal in 2006.
   AIMS training, provided throughout by Lasa, who produce the software.
   Software training in the other products, provided by the SRU from 2004.
   Using the internet training, provided by the SRU from 2004.


Broadband issues
The decision at the outset to provide access to the software over a broadband SDSL
link led to some significant problems, with the result that at no time, in fact, were all
the participating agencies actually linked into the system. In a few cases, problems at
the agency end could not be resolved, despite the best efforts of the project team. For
many of those that did get a connection there was considerable delay between them
joining the project and being able to access the software.

These problems fell into two categories: slow responses from external suppliers, and
difficulties getting the technology to work, particularly in some parts of the borough or
some specific buildings.

Figures provided by the SRU indicate that:
   By March 2002, just after the start of the project, there were 18 member
     agencies8, 5 with broadband and none connected to the Where IT’s @ server.

8   These figures all exclude Stratford Advice Arcade, since its status as a participant was different.


        By March 2003 there were 19 member agencies, 15 with broadband, 5 connected
         to the Where IT’s @ server.
        By March 2004 there were 19 member agencies, all with broadband, 9 connected
         to the Where IT’s @ server.
        By March 2006 there were 19 member agencies, all with broadband except two, 15
         connected to the Where IT’s @ server. (The two without broadband had had it
         when they joined the project, but lost the service due to subsequent financial or
         technical problems.)

Now that broadband availability can more or less be taken for granted — certainly in
urban areas — it is important to remember that this was not the case even four years
ago, at the start of Where IT's @. As an early adopter of the technology, the project
was inevitably taking a risk and these delays were the consequence.

In the light of this experience — and subsequent technological developments — it has
now been decided to adopt a different approach. Access to the CPAG Handbook (from
September 2006) and AdviserNet (at a later date) will now be provided over the
internet, using the ADSL or other standard broadband connections which virtually all
agencies now have. This will result in lower use of the council servers, and they will
be phased out. Before that, all agencies will have AIMS and the Lisson Grove package
installed on their own system if they wish to continue using these packages. (Those
agencies with client-server networks have always been able to have this option.)


3.       Where IT’s @ evaluation criteria & methodology
The top-level evaluation criteria proposed for this review included looking at:
   benefits to the agencies using the system.
   benefits to service users.
   better information about service users and their needs.

In addition, the review aimed to look in detail at how the project has worked, what has
worked well and what could have been done differently. This includes:
    features which made the project work well for organisations and their staff.
    features which caused problems for organisations and their staff.
    features which directly affect service users, for good or ill.
    how the project has been introduced to organisations, and what changes this has
     brought about.
    how the project has been run, technically and managerially.
    the financial implications for organisations of participating in the project.


Methodology of the review exercise
This review is based on three main data-gathering exercises: a set of documentation
provided by the SRU setting out the history of the project and other key pieces of
information; detailed telephone interviews with managers at 16 of the agencies
currently involved with the project; and a detailed questionnaire survey of staff
working at those projects. See Appendix B for more details of how the review was
conducted.



In the discussion that follows, boxed material in italics is quoted directly from the
manager interviews. Statistics from the questionnaires are introduced with the exact
question asked.

Each of the main elements of the project — software, consultancy and technical
support, broadband provision and training — is now looked at in turn, followed by an
overall assessment of the project.


4.   The software
The core benefit envisaged by the Where IT's @ project was from the software being
provided. The broadband access, technical support and training — although they did
have benefits in their own right — were all essentially designed to support the
effective availability and use of the software.

One agency which has been nominally part of Where IT’s @ from the outset has never
used any of the software because of continuing ICT problems, in particular with their
network. They now expect that the difficulties will be resolved soon, and are keen to
use the products available. Where data from the interviews is used below, this relates
to the 15 agencies which have had the opportunity to use the software.


AIMS
AIMS is a case-recording system produced by Lasa specifically for the voluntary advice
sector. It is designed both to record casework and to make it easy to generate
statistics about cases and clients.

Of the 15 relevant agencies interviewed, four had opted not to use AIMS, as they felt it
was not appropriate for them. In one case this was because they don‘t give advice and
therefore found it less relevant, in two agencies they already had a suitable case
recording system, while in one case the interviewee was unclear what the reason had
been. One agency which opted not to use AIMS now uses it and is enthusiastic, while
another started using it but no longer does so.

We did use AIMS at the beginning [and] spent a lot of time inputting records. But
because we do advocacy rather than advice we didn’t find it useful really [and now
have a system we designed ourselves in Excel]. AIMS was like a strait-jacket and it is a
relief not to use it any more.

Four agencies have had mixed results. One was using AIMS when they were based at
another participating agency, whose administrative staff entered the data for them.
Having recently moved to separate premises, their use of AIMS is currently in
abeyance but they intend to restart. The other three have used AIMS but not as much
as they might, in one case because clients are unwilling to have their names recorded.


We only use AIMS for our main cases, not every enquiry. For short visits and phone
calls we keep a tally on paper. AIMS is definitely worth having; as a manager I would
like more information. Our staff concentrate on people not figures, [but I want to
encourage them to enter more data].

For the remaining six organisations AIMS has been a triumph. It is in regular use
and is producing the anticipated benefits in terms of better recording and better
management information.

AIMS … has improved our recording and given us more information. [We have]
improved more than just the software. We think more about data capture. It has been
a struggle and we do still see rubbish data in reports [but] we explore how to do it
better. That’s what monitoring is about.

AIMS is very, very useful. It gives an up to date daily record of where the organisation
is at. We were doing it, but AIMS is far more efficient.

AIMS is now a core package for the organisation.

The potential benefits to managers from the information that AIMS can deliver have
been realised in most, but not all, of the ten agencies that are using it. Four of the ten
reported that it had not noticeably made a difference to the quality of information
available, one felt that it could help, but only if the agency became more rigorous
about entering the data, and one had not been using it for long enough to tell (but was
expecting it to bring benefits, having seen it in action elsewhere). The remaining four
were enthusiastic.

AIMS has absolutely given managers a better picture. We now know when the figures
are wrong. We can spot anomalies and correct them, and are beginning to be able to see
trends.

We use statistics in funding bids and in targeting service delivery — for example we
noticed take-up of a particular service was poor among elderly black people and took
steps to address that.

Using the statistics helps us to fundraise effectively. We can show, for example, that
20% – 30% of our work is immigration-related.

AIMS is more efficient and we can manipulate the figures better.

Comments from users in the questionnaire survey bear out this picture. Of the 20
respondents, 14 had used AIMS.

They were generally frequent users, supporting the conclusion that AIMS is a core tool
in organisations that are using it:


How much do you normally use AIMS?
     Every day .................... 5          Several times a week ....... 5            Several times a month... 2               Less often ............2


AIMS is most commonly used directly by advice workers, although one third of
respondents enter data on behalf of colleagues. Half (7) both enter data and then
make use of it themselves, but almost as many (6) enter data for use by others in the
agency. This illustrates that decisions have to be made by individual agencies on
whether advisers should do their own data entry or whether it should be done by
administrative staff.

What do you use AIMS for? (Tick all that apply)
     Entering data about your own clients ................................................................... 10
     Entering data about other colleagues’ clients .......................................................... 5
     Using data about individual clients (to send letters, for example) .............................. 5
     Using data about groups of clients (to produce statistics, for example) ......................... 5
     To obtain lists of open files for file review purposes ................................................ 1


All the respondents find AIMS useful, and an improvement on previous systems for
the 10 respondents who had used another system:

How useful is AIMS?
     Very useful ................. 10          Quite useful ....................... 4    Not very useful ............... 0        Not useful at all ....0


How does AIMS compare with the case recording system you had before?
     A big improvement .... 8                 A bit of an improvement ....... 2          No different really ........ 0          Worse than before ..0


As with the managers, for almost everyone AIMS provides better information:

To what extent does AIMS provide you with better information about your service users and their
needs?
     A lot .............................. 7    Quite a lot.......................... 6   Some .............................. 0    Not much .............1


To what extent has AIMS improved your organisation’s ability to collate information about its work
and influence long-term planning of services?
     A lot .............................. 6    Quite a lot.......................... 6   Some .............................. 2    Not much .............0


The AIMS software is generally easy to use, and there were only two specific
suggestions for improvements:

How easy is it to use the AIMS software?
     Very easy ..................... 6         Quite easy ......................... 7    Not very easy ................. 1        Not easy at all ......0


How could the AIMS software be improved?
     There are some restrictions on how you can adapt AIMS for your particular organisation’s needs. For example
        you can only add six user defined fields to the subject details page. More flexibility in how you can adapt
        AIMS for your organisation would be useful.
     Could be more user friendly; people generally seem to make mistakes entering data; using the system can be
        confusing.


On balance, therefore, AIMS has shown its worth. For many agencies it has been very
useful, even transformational, and a majority of participating agencies have made
some use of it. In most of those that have not used it, the reasons are nothing to do


with the principle of case recording or the quality of the software, but more to do with
the specific situation of the individual agency.


Lisson Grove benefits software
The Lisson Grove software was introduced to the Where IT's @ project in 2004 in place
of the Maximiser product, because of poor feedback and a specific complaint about an
agency giving wrong advice on the basis of an incorrect calculation.

The Lisson Grove product is a well-established benefits calculation package. It is,
obviously, only relevant to those agencies whose advice work includes calculating or
advising on benefits. Five agencies make no use of the Lisson Grove software at all, in
most cases because they don’t give benefits advice.

A further three make limited use of it, in one case giving as the reason that it doesn’t
cover the main benefits they advise on. The remaining seven do use the package, find
it helpful or very helpful, and find that their staff like it and find it easy to use (in one
case easier than a previous equivalent).

However, in the interviews it did not elicit quite the same level of enthusiasm as
AIMS. This is almost certainly because it is a niche product which is ideal for
handling complex benefit situations; where an agency is giving advice on the process
of qualifying for and obtaining a benefit with which the workers are familiar there is
often no need to carry out detailed calculations. This software, unlike AIMS, is
therefore less likely to become central to the agency’s work or lead to significant shifts
in its way of working.

The Lisson Grove software is very good. We were doing [benefits calculations], but less
well, and less reliably.

Comments from the user questionnaires again bear out the managers’ views. This
time, only seven of the 20 questionnaire respondents had used the product at all, and
they used it less often than AIMS. All of its key functions are used, but for detailed calculations
more than the quick option:

How much do you normally use the Lisson Grove software?
     Every day .................... 0    Several times a week ....... 2           Several times a month... 2          Less often ............3

What do you use the Lisson Grove software for? (Tick all that apply)
     Benefits calculations .... 6          Tax credit calculations .... 5         Quick benefits calculations ... 3


All the respondents find the Lisson Grove software useful, and most find it a big
improvement on alternative benefits calculation methods:

How useful is the Lisson Grove software?
     Very useful ................... 3   Quite useful ....................... 4   Not very useful ............... 0   Not useful at all ....0


How does the Lisson Grove software compare with the way you used to calculate benefits (either
manually or using a different program)?
     A big improvement .... 5            A bit of an improvement ....... 2         No different really ........ 0      Worse than before ..0


The Lisson Grove software is generally easy to use, and although there were two
answers about improvements, these contain no specific suggestions:

How easy is it to use the Lisson Grove software?
     Very easy ..................... 1    Quite easy ......................... 6   Not very easy ................. 0    Not easy at all ......0


How could the Lisson Grove software be improved?
     By making it very easy and understandable.
     No suggestions. The Lisson Grove software is the least used by us as benefits is not a major part of our work.


There is no doubt that the Lisson Grove software performs a useful function for those
agencies whose service involves assessing a client’s likely benefits or checking that
they are receiving the benefits they are entitled to. It fulfils this specific role well.


CPAG handbook
The electronic version of the CPAG handbook aims to make information about benefits
and guidance for advisers more readily available than the long-standing and highly-
respected printed handbooks. In addition to guidance on the law from the handbook
itself, it contains annotated legislation and Commissioners’ Decisions, all of which can
assist the handling of a case. As well as being searchable in a greater variety of ways,
with better cross-referencing, an electronic version has the key benefit of being more
easily updated — a crucial feature in giving accurate and reliable advice.

The responses to the handbook were fairly neutral from most agencies. Three of the
15 interviewees said that their agency doesn’t use the electronic version, and in two
cases they prefer using the books or have gone back to them. Three agencies said that they
sometimes use the electronic version, or that some people use the electronic version
while others prefer books. One of these said that the product has done what it is
supposed to, and welcomed the regular upgrading.

Of the remaining eight, whose agencies all do use the electronic handbook, one said it
was very useful, one uses it a lot, and two others particularly mentioned the value of
access to Commissioners’ Decisions.

Eight of the questionnaire respondents use the electronic version of the CPAG
handbook, but on the whole not very often:

How much do you normally use the CPAG CD-ROM?
     Every day .................... 0     Several times a week ....... 1           Several times a month... 2           Less often ............5


They all find it useful, and mostly an improvement on the printed version:

How useful is the CPAG CD-ROM?
     Very useful ................... 3    Quite useful ....................... 5   Not very useful ............... 0    Not useful at all ....0


How does the CPAG CD-ROM compare with the printed handbooks?
     A big improvement .... 3            A bit of an improvement .... 4            No different really ... 0     Worse than the handbook . 1


One reason for the relatively low use of the electronic CPAG information may well be
that even those who use it do not find it particularly easy:

How easy is it to use the CPAG CD-ROM?
     Very easy ..................... 0    Quite easy ......................... 6     Not very easy ................. 2   Not easy at all ......0


There were no specific suggestions for how the product could be improved.

Use of the electronic version of the CPAG materials seems to be largely a matter of
personal preference, and for most agencies it appears to be the least valuable of the
four products on offer through Where IT's @ (but with a specific role to play in those
that need access to Commissioners’ Decisions). Although it could, in theory, be more
up to date than paper versions, none of the agencies said that they required their staff
to use it for this reason. Nonetheless, most agencies are using it to some extent, and
in some cases finding it particularly valuable.

One of the smaller agencies commented on the cost of buying (and continually
replacing) the printed CPAG Handbooks. It should not be overlooked that Where IT’s
@ gave access to this valuable resource not only electronically, but also for free.


AdviserNet
The information system provided by Citizens’ Advice for its member Citizens’ Advice
Bureaux has long been regarded as authoritative but the paper version was always
unwieldy — especially since the frequent updates used to have to be made manually.
The electronic version has obvious advantages. As with the CPAG materials, giving
free access to this resource is a significant financial benefit, especially to the smaller
agencies.

Almost all the interviewees gave very similar responses in relation to AdviserNet
as to the CPAG Handbook, reinforcing the idea that it is attitudes to seeking
information electronically which determine people’s enthusiasm, rather than the
specific product.

There was a slight preference for AdviserNet. Two respondents did say that it was
particularly good, or that they used it more than the CPAG Handbook. While there
were few ringing endorsements of AdviserNet, it is perhaps an even better outcome
that it seems to have become an unobtrusive, accepted component of the resources
available to advisers in almost all the participating agencies.

The picture from the questionnaires about AdviserNet is similar to that for the CPAG
information, with, again, AdviserNet having a slight edge. Nine respondents had used
it, some fairly often:

How much do you normally use AdviserNet?
     Every day .................... 0     Several times a week ....... 2             Several times a month... 4          Less often ............3



They all find it useful, and the six who have experience of the paper version find the
electronic option an improvement:

How useful is AdviserNet?
      Very useful ................... 5    Quite useful ....................... 4     Not very useful ............... 0   Not useful at all ....0


How does AdviserNet compare with the paper version of the information system?
      A big improvement .... 4            A bit of an improvement .... 2            No different really ... 0     Worse than the paper ....... 0


AdviserNet clearly does better than the CPAG information on usability, with only one
suggestion for improvement:

How easy is it to find what you are looking for in AdviserNet?
      Very easy ..................... 1    Quite easy ......................... 8     Not very easy ................. 0   Not easy at all ......0


How could AdviserNet be improved?
      Allow paragraph numbers of each category to be entered as part of a search.



5.    Consultancy and technical support
Unsurprisingly, given the proportion of interviewees who had not been closely
involved in the early stages of the project, over half were unable to comment directly
on how much they had been helped by the initial health check and consultancy (or
initial support, in the case of those which joined at a time when consultancy wasn’t
specifically available).

However, for all seven of those who were able to comment, the experience had been
extremely positive, and three of these gave specific examples of the effect it had had.

The initial consultancy was very helpful. It prompted us to adopt a proper strategy and
to build ICT into our business plan. It was timely and we wanted to make the change.

In 2002 we were not even on the internet. Because Where IT’s @ told us that the
software needed better computers, the management committee took ICT more seriously
and we now lease better computers which are upgraded every two years.

All the staff went to the health check sessions and we took the decision to set aside
money for replacing our computers. It helped to focus our thinking.

The main purpose of the consultancy and technical support was to ensure that
agencies had the internal infrastructure necessary to give all their relevant staff
access to the broadband connection, and hence to the Where IT's @ software. In the
event, it was the broadband link that caused the most trouble (see below).


Meanwhile the internal developments were generally valuable in their own right, and
the change can be seen by comparing figures for networking at the start of the project
in 2002 (discussed above) and at the time of this report in 20069:

                                     No network                Peer-to-peer network                      Client-server network
                  2002                       6                               4                                2 (1 not working)
                  2006                       0                       8 (3 not working)                     9 (2 with some issues)


(These figures exclude those agencies with only one computer, for whom a network
would not be relevant.)

Firstly, this reflects the fact that the number of computers had increased. By 2006
almost no agencies were still operating with just one computer. In some the increase
had been dramatic: from six to 12, for example, from one to seven, or from four to 10.
In most cases, by 2006 the number of computers in use more or less matched the
number of paid staff.

In almost all cases the type of network appeared appropriate for the number of
computers10. All those without working peer-to-peer networks were on the borderline
of needing one; all were reported to have only two computers each.

The interviews with managers explored a range of factors which indicate whether an
organisation is using ICT appropriately. These include not just technical features but
also the equally important policies, procedures, budgets and other factors which are
essential for effective use of ICT.

The variation between agencies makes comparative analysis difficult, but about two
thirds of the agencies came out well on most of the measures that applied to them.
This is encouraging, because it suggests that they have recognised the importance of
ICT, approached it in a systematic way, and spent some time, effort and money in
managing it well.

A similar picture emerges from the responses from users in the questionnaires.
Although some are not happy with their organisation’s ICT facilities, a majority feel
that their organisation’s provision of ICT in general is good or excellent, and two
thirds are generally happy with the ICT available to them personally:

How good do you think your organisation’s provision of ICT is, in general? (For example: are there
enough computers? are they up to date? do they have good software? how does it compare with
other organisations you may have worked in?)
        Excellent ...................... 5   Good ................................. 6   Average .......................... 7   Not good at all .....1




9    With the caveat that the list of agencies had changed between the two dates, and that in a few cases
     an educated guess has been made, as respondents were not clear what kind of network they had.
10   A peer-to-peer network is usually recommended only for five or fewer computers. Above 10 a client-
     server network is almost essential.


How suitable is the ICT available to you for doing your work? (For example: do you have the use of a
computer whenever you need it? is your computer fast enough to do the work? is everything set up
to ensure that you avoid health and safety problems? if you have a disability, is the hardware or
software adapted to enable you to work it properly?)
     Ideal ............................. 3   Mostly fine ....................... 10   Not good enough ............ 5    Not good at all .....1


They also feel that things have improved:

How much better is your organisation using ICT now than it was five years ago (or when you started
working there)?
     A lot ............. 8     Quite a lot .........3      Some................ 1     Not much .......... 1   Don’t know ........ 3


Obviously, some of the changes and improvements would have come about without the
Where IT's @ project, but some credit, especially in the case of the smaller agencies,
must go to the training, consultancy and other resources made available through
Where IT's @.

One specific resource provided by Where IT's @ was technical support. Very little data
was collected about the initial providers of this, but many of the managers interviewed
were pleased with the service they are currently getting, provided by John Pipal.
Unfortunately for some, the project only ever intended him to undertake certain tasks
— ensuring that agencies can use the Where IT's @ software — not to provide a
general free support service. Some of the agencies were disappointed by this, but
several of the participating agencies were using him on a commercial basis for
additional support; budgeting for this was something encouraged by the Where IT's @
project.

Only three managers had no experience of using John Pipal’s support. The remainder
had found the support useful, with reservations in a few cases — mainly because he
was unable to help them with matters unrelated to Where IT's @. Several commented
on these limits to the support available. Others mentioned internal support
arrangements which meant that their need for support was reduced, but many of
these had still welcomed relevant external help.

Occasionally we use John Pipal or John Defoe [from the council] if specific to Where
IT's @; otherwise we use our own people.

The quality of the support was highly rated, and many managers commented on John
Pipal’s approach.

We use John Pipal independently and are very impressed.

We pay John to come in ourselves. We have him as our own IT consultant. He’s
fantastic, explains well, is good and responsive. He works well with our designated IT
person.

He’s good, explains things well and is very friendly.


Given the effect of the broadband problems (discussed in the following section), the
project would clearly have suffered if internal networking problems had also
hampered the ability of agencies to make use of the software. That this didn’t happen,
and that most agencies are pleased with the support they are getting, must be seen as
a positive aspect of the project.

The impact of the Where IT's @ project on the staff must not be overlooked. Almost all
said that they are now pretty confident about using computers, and a large majority
that they are more confident than three years ago. Not all of this may be attributable
to Where IT's @, but the project is likely to have at least made a contribution.

How confident do you feel about using computers?
     Expert .......................... 2   Pretty good..................... 17   A bit nervous ................. 1   Very nervous .......0

How has your computer confidence changed over the last three years?
     Much more confident ... 10              A bit more confident ...... 8       No real change ............... 2



6.   Broadband provision
Technical communication problems have dogged the project throughout. Detailed
internal documents on the sequence of events have been provided by the SRU.

The technical solution devised at the outset was for all the software being provided
through Where IT's @ to run on the council server, and for participating agencies to be
given remote access to this. The main benefit of this approach is that the council could
then manage the software centrally and, in particular, take responsibility for keeping
it up to date and for backing up the agencies’ data. It is, of course, essential for advice
work to be based on completely correct and up to date information; the small size and
lack of ICT expertise in many participating agencies would suggest that they may
have struggled to manage reliably the updating and back-up processes for themselves.

The consequence of this decision, however, was that:
   Remote access had to be provided for all participating agencies to the council
     server.
   The access had to be very reliable.
   The system had to deliver information to the user without delays that impeded
     the work or discouraged its use.
   The system had to be secure, so that neither agencies nor the council could see
     each other’s data; equally, it had to prevent access to internal data via the
     internet.

Based on previous council experience, it was decided to provide access to the server
through SDSL, and to deliver the software through Citrix. SDSL is a commercial-
strength broadband technology which, in principle, provides a relatively cheap, fast,
“always-on” connection. Citrix is a “thin-client” technology which, among other things,
reduces the amount of information that needs to be transmitted, and therefore speeds
up the delivery of information.


Each agency would need a router installed on their site, connected to their network, to
provide the SDSL access and handle password protection and other security measures.
The system as a whole would connect to the internet through a firewall, another
essential security provision.

All of this technology is mainstream in the commercial world, although much of it is
more complex than that used in most small and even medium-sized voluntary
organisations. It requires considerable technical expertise to set up and maintain.

In the event, when the installation began in 2002 it did not go smoothly. There were
technical problems in getting the SDSL links to work properly both at the council end
and at several of the participating agencies’ sites — especially where the agency was
participating not only in Where IT's @ but also in other networking arrangements.
Technical solutions were, in most cases, eventually found, but most of the issues were
challenging and took considerable time and effort to resolve. Additionally there were
delays at other sites, caused by slow delivery by suppliers; these were wholly outside
the control of the project.

In addition, the initial set-up of the Where IT’s @ network didn’t work because other
traffic on the council system interfered with the Where IT's @ traffic; this was resolved
quickly. It also took until January 2003 before the Citrix system was being correctly
backed up, as the original approach had been found to be unreliable.

The result was that over a year after the project officially started only a quarter of the
agencies were connected to the server, and therefore able to use the Where IT's @
software — although preparatory work and training were being provided during that
time — and at no point were all those nominally participating actually able to connect
to the service.

By 2004 it was clear that it was becoming reasonable and practical to expect agencies
to have their own standard ADSL connections, although these were generally slightly
slower than SDSL. For the 2004 batch of participants, an existing ADSL connection
became a condition of joining the project.

This brief history shows that the system fell short in one of its four key areas, and
slightly short in another: remote access was not provided for all agencies, and there
were initial problems with reliability. However, once the problems were eventually
solved they generally stayed solved, enabling agencies to participate in the project
fully and reliably. There do not appear to have been any problems with the other two
areas: speed or security.

Several managers mentioned in their interviews that there had been delays or
connection difficulties which had hampered their participation in the project, and in
the questionnaires, more than half reported having had at least occasional access
problems. Unfortunately, from the responses it is hard to tell exactly what all the
problems were, but it was certainly more often some aspect of the connection
between the user and the council server than a training issue or a fault with the
software itself.




                                                                 AIMS   Lisson   CPAG   AdviserNet
                                                                         Grove
Users reporting access problems/total responses                  10/14   4/7      4/8      5/9
What the problem was:
  Someone else was using the system                                 1     2        0        1
  I hadn’t been trained on the system and couldn’t make it work     0     0        1        0
  I had been trained but couldn’t remember how to make it work      1     0        1        0
  The software itself didn’t work properly                          1     0        0        1
  Something technical but I don’t know exactly what                 7     2        2        3
  My organisation’s computer system wasn’t working                  3     0        0        0
  My organisation’s internet connection wasn’t working              0     1        0        1
  The server at London Borough of Newham wasn’t working             2     3        1        2
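
For illustration only (no such tool formed part of the project), the following short
Python sketch shows the kind of check that can help separate the last two categories
in the table above: a fault in the agency's own internet connection, as opposed to the
council server being unreachable. The host names and port shown are placeholders, not
details taken from the Where IT's @ set-up.

    # Illustrative sketch only: placeholder host names, not the real project set-up.
    import socket

    def can_connect(host: str, port: int, timeout: float = 5.0) -> bool:
        """Return True if a TCP connection to host:port succeeds within the timeout."""
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False

    # First check general internet access, then the central server itself.
    internet_ok = can_connect("www.example.org", 80)
    # 1494 is the standard Citrix ICA port; the host name here is hypothetical.
    server_ok = can_connect("citrix.example.gov.uk", 1494)

    if not internet_ok:
        print("The local internet connection appears to be down")
    elif not server_ok:
        print("The internet is up, but the remote server cannot be reached")
    else:
        print("Both the connection and the server respond")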


When asked in the questionnaires what was the worst aspect of the Where IT's @
project, only five specific comments were given (plus two who made a point of saying
"none"). Of these comments, four related to delays in getting the connections working,
or unreliable access. (The other mentioned insufficient training.)

What has been the worst thing about the Where IT's @ project?
      If the system goes down I am unable to complete my work.
      Technical problems accessing the information, delays in the installation of relevant software, which meant that
           advisers had to wait months after attending the training before the service was actually available.
      A lot of hold ups early on.
      Too many hold ups at the start.


Clearly, these delays and technical communication problems were unsatisfactory, and
must have reduced the impact of the project for some of the early agencies. In the
long run, however, broadband has been an asset to all the participating agencies.


7.    The training programme
Most of the training courses were evaluated at the time by the organisers, and the
results have been incorporated into this review. In addition, questions were asked
about the training in both the manager interviews and the questionnaires.

The managers were not asked to differentiate between the various courses. Their
overall assessment was generally positive. Four were unable to comment, as
they had not been involved at the time.

Of those that commented, three had found the training helpful, but felt that there
were outstanding problems that it had not resolved.

Not everything was useful. I know more now but am still struggling.

The remaining nine had found it useful or helpful. Two mentioned that they had sent
a manager on the training, who was now passing the training on internally. Several
mentioned a general benefit to confidence, in addition to the specific skills covered.


We took up the offer of training in spades. It was very useful and has had a general
spin-off on IT confidence as people had to engage with computers.

Our manager did the training and is now training volunteers. It was very useful.

We are enthusiastic about the training. It improved people’s computer literacy across
the spectrum.

We went on the training because it was free. We came away knowing what we’re doing.
Even those who already knew before the training became more confident.

The information available about specific courses indicates that all appear to have been
at least satisfactory and most — in particular those run by the SRU itself — generally
good to excellent.


Managing ICT and Financing ICT
Fourteen agencies attended the training in 2002, which was provided by Lasa. The
satisfaction ratings were as follows:

                                 Managing IT        Financing IT
                     Excellent       32%                16%
                      Good           50%                42%
                       Fair          18%                42%


A further course on Managing ICT was devised and delivered by John Pipal in April
and May 2006. This is aimed at agencies which are too small to have an ICT support
officer or which do not have sufficient funding to buy in ICT support from a
consultant.

Only one questionnaire respondent had attended the initial training on Managing
ICT, rating it "Good".


Here’s Where IT’s @ training programme
The SRU developed and delivered this training programme which started in March
2004. It covered software whose providers did not offer separate training: the CPAG
information, AdviserNet and the Lisson Grove software. In addition the SRU
delivered a ‗Using the Internet‘ course, showing advisers how to access free
information on the internet. This was then offered in the SRU training brochure to
advisers outside the membership of the Where IT‘s @ project.

Forty-four evaluation forms from the twelve ‗Using the internet‘ courses, run between
March 2004 and June 2005, have been returned, along with twenty from the nine courses
on using advice software, run between October 2004 and June 2005. The results are very
impressive:
                                   Using the         Using advice
                                    internet           software
                  Excellent           70%                 70%
                    Good              30%                 25%
                 Satisfactory                              5%


On the whole, the lower scores were for the earlier courses, suggesting that the team
had learned from any initial difficulties and made improvements.

Twelve questionnaire respondents had attended ‗Using the internet‘ and seven had
also attended the advice software training. Figures for the two courses were not
collected separately. Although the scores are not quite as good as those collected on
the day, they are still very satisfying, and suggest that the courses made a lasting
difference:

How good was the Here's Where IT's @ training?
     Excellent ...................... 3     Good ................................. 8   Average .......................... 1   Not good at all .....0

How much better could you use the internet for your advice work afterwards?
     A lot better...................... 5     A bit better ..................... 5     No real difference ........... 2



AIMS training
All agencies who opted to have the AIMS software were offered training from Lasa, at
introductory and specialised levels. Initial advice from Lasa was that places on their
training courses should be targeted on agency administrators and managers, but the
SRU felt that this underestimated agencies‘ training needs. Agencies were then
offered training for all their staff on AIMS. The training courses provided by Lasa
apparently received good feedback at the time.

A total of eight questionnaire respondents had attended at least one AIMS training
course. This, again, was rated highly, both for quality and for its effect.

How good was the AIMS training?
     Excellent ...................... 3     Good ................................. 5   Average .......................... 0   Not good at all .....0

How much better could you do your work (those parts that relate to AIMS) after the AIMS training?
     A lot better...................... 5     A bit better ..................... 2     No real difference ........... 1



Lisson Grove Quick Benefit Calculator course
Experienced advisers from participating agencies were invited to attend this course,
on the basis of a training need assessment during the Here‘s Where IT‘s @ course.

Eight questionnaire respondents gave feedback on this course which, again, appears to
have been well received and to have had the desired effect:

How good was the Lisson Grove training?
     Excellent ...................... 1     Good ................................. 7   Average .......................... 0   Not good at all .....0

How much better could you do your benefits calculations after the Lisson Grove training?
     A lot better...................... 5     A bit better ..................... 3     No real difference ........... 0


8.     Project administration
The Where IT's @ team at the SRU stayed in touch with the participating agencies by
phone and through newsletters, through an annual visit to each agency, and through
meetings to which all project participants were invited.

This support, and the level of contact with the participating agencies, was commented
on very favourably by several managers in their interviews:

Celia11 has been good at coming in to check how it’s going. She’s very good at taking
problems and dealing with them.

Celia came a number of times and talked to staff about the possibilities. When it
changed and was upgraded she came again.

The SRU appears also to have responded to lessons learned during the project, and
made appropriate changes, both to what was on offer and the way it was delivered. In
the case of most of the agencies, the SRU staff have clearly had good personal
contacts, know the individual situation of the agency, and understand some of their
aspirations as well as the pressures they are under.

At the same time, it has become evident during this review that the SRU has not
always been informed by some agencies when they hit problems or took decisions to
reduce their use of the software. The SRU has also found that agencies did not always
respond to invitations to general meetings or offers of review visits, and some of the
agencies have been much less keen than others to participate in this current review
exercise. This may well reflect the priority Where IT's @ has within the agencies.
Those that do not perceive it as core to their activities may not appreciate the need to
report back or seek the support that would have been available.

One lesson from this is perhaps that there is a difficult balance to strike between
keeping in active and regular contact, and monitoring to the extent that it becomes
intrusive. Sometimes there may be no option other than to wait to be told about
changes or problems.


9.     Overall impact
Both the managers and the questionnaire respondents were asked for an overall
assessment of the impact of Where IT's @. Managers were asked whether the project had
enabled them to "do things better" and to "do better things". Many replied positively
to both questions.

In terms of "doing things better", three made the point that, with the advice Quality
Mark, they were already delivering a good quality of service.



11   Celia Minoughan, Assistant Unit Manager, Social Regeneration Unit, LB Newham, who has
     managed the Where IT's @ project throughout.


Three felt that the project had really made no difference to their service, and three
were unable to point to improvements because the system was not fully operational,
although two were anticipating benefits in the future.

Four particularly mentioned staff development as a positive outcome, while seven
could point to specific improvements in the quality or speed of their work.

Quality was always good. Now it is delivered more efficiently, to more people.

Accuracy of data and staff IT literacy have increased. It has helped with quality of
advice, but we would have done that anyway as we need the Quality Mark for our
funding.

It has helped with staff development, and it nudges us into thinking about
management and service delivery.

When it came to "doing better things", seven managers did not think the project had
expanded their agency‘s horizons, and one did not know.

It has not really made a difference, but we would have had to work harder without it.

For the other eight, the main benefit had been in enabling the agency to work faster,
and/or to work with more people. Three, however, mentioned aspects of the
organisation‘s operations which the project had changed for the better.

We can act faster on clients’ behalf through electronic communication with the council.

The case-load used to be heavier before the Housing Benefit system changed, but now
[because of Where IT's @] we [have been able to diversify and extend our service and]
deal with a wider range of enquiries.

Now we are able to make much bigger step changes. Without [the facilities provided by]
AIMS a recent project [at this agency] would have struggled.

We use the statistics to compare with local demographic data and then target services.

We went for Quality mark status at the same time. Where IT's @ has helped up our
game in relation to policies and procedures, as part of a raft of initiatives that have
helped ensure the organisation runs smoothly.

Managers were also asked about the financial impact of their increased use of ICT.
Most felt that the change had been negligible or positive. Two stressed that they had
no money to spend in any case, even though one could now cost the ICT better. Others
felt that the expenditure they had made was worthwhile:


We came out on top. We are now much more professional, with better policies and
procedures, and it has made the tendering process much easier. There has been no
drain on reserves up to now. We will fundraise for the money we need for upgrading
our equipment.

We spent money by rearranging the budget, but it was worth it. Where IT's @
highlighted something we needed.

We have spent from our reserves, but it was money well spent. I don’t lose any sleep
over it. Our investment is more efficient and we will put money aside in future [to keep
the system up to scratch].

We have not spent money we wouldn’t have. By being able to take on more cases we
bring income into the borough, which helps justify our existence.

This generally positive view was shared by the questionnaire respondents.

How much benefit has the project been to your organisation?
     A lot ............. 5   Quite a lot .........5   Some................ 3   Not much .......... 0   Don’t know ........ 6


What is less clear is the impact on service users:

How much benefit has the project been to your service users?
     A lot ............. 6   Quite a lot .........4   Some................ 2   Not much .......... 2   Don’t know ........ 5


Most of the managers felt that their organisation would have made some progress
even without the Where IT's @ project, but many welcomed it nonetheless, and for
some it had provided crucial help.

It’s been invaluable. Paying for support would not be an option as we don’t have the
money.

It would have been very difficult. We were at a standstill for many years. We have been
enabled to grow and employ staff because of it.

It would have taken longer to get to our now confident and independent IT status. [The
input from the project] focused us on what we were doing in IT and enabled us to pick a
sensible and supported way through the minefield.


10. Conclusions
The Where IT's @ project consisted of a number of components, which have been
analysed individually above. It is clear that the package was well thought-out and
that most of the components worked well:
   the software was appropriate, relevant and effective;
   the consultancy and support was appropriate and effective, even if it did not
      meet the full aspirations of some agencies;


     the training was of good quality and effective;
     the project management was well-regarded by participants.

The only element of the project which does not appear to have worked as well as it
might is the decision to depend on broadband connections which then could not be
delivered consistently and in good time. With hindsight, using SDSL to connect to the
council server was probably the wrong approach; whether the decision was so
obviously wrong at the time it was made is less certain, given that the rapid
deployment of commercially-available ADSL was not a foregone conclusion. Four
years down the line, with ADSL a straightforward option for all agencies, the
opportunity is being taken to remove the complexity SDSL brought to the project, but
the memory of the early delays is still vivid in many of the agencies.

There are lessons to be drawn from the project‘s experience of putting the software on
a central server, especially in light of the growing trend for applications to be offered
over the internet. The central provision of software allowed the many agencies with
old, low specification computers to participate in the project without having to invest
in new equipment. It also meant that the council could acquire the software
economically, and could supervise the all-important updating and backing up, rather
than relying on there being sufficient commitment and technical ability in each
agency. On the other hand, it meant that the broadband link became essential to all
four software packages, and thus to the whole project. Where the agency was relying
on the software being available at all times, any unreliability in the broadband
connection led to serious frustration.

Did the package as a whole work? Different agencies have made different uses of
Where IT's @, and because they all started from a different place, there is no single
answer to the overall effect of the project. What is clear is that for some agencies it
worked spectacularly well. Several managers, in their interviews, made very specific
positive endorsements of the project:

It really worked for us.

We are absolutely positive about the project. It is the type of capacity-building that
works.

It is the best thing Newham council has done for us. It gave us a big boost and enabled
us to grow.

Support from the project was important and has more than a cash value.

We would have been limited because of the cost. We would have wanted to but could
not afford it.

We would have got there, but not as thoroughly and not as well.


It made all the difference. Without Where IT's @ we would still just be word processing
on old equipment.

For these agencies all the core evaluation criteria were met:
   benefits to the agencies using the system.
   benefits to service users.
   better information about service users and their needs.

In a few other agencies it has perhaps not hit every target, but has nevertheless been
well worthwhile. This assessment is based on the agency using at least some of the
software effectively and running its ICT well, as well as perceiving that Where IT's @
has in some significant way contributed to this state of affairs. The number of
organisations where it has worked very well or at least well enough is about a half of
those currently involved in the project, or a third of those that have been involved
overall at some point.

That is not to say that the project had no impact on the remaining organisations. In
many cases it has made some changes, but piecemeal rather than transformational.
In addition, the project is still continuing, and some of those which have not felt the
benefits yet are still anticipating doing so. There are probably only five organisations
currently involved where the project appears to have made little impact and to be
unlikely to do so.

The organisations where it has worked very well include both small and medium-sized
ones; it is not the size of the organisation that determines whether this type of input
will be successful. On the evidence of this review, probably the most important factor
in success is the presence of a manager who sees the value of developing ICT and is
prepared to commit the organisation to it. We have already seen how agencies could
drop out if a new manager came in who was not keen on the project. Several of the
interviews with agencies where it had been an undoubted success revealed a manager
who was ready to make a step-change in their organisation‘s use of ICT and to
champion it, against opposition if necessary. Generally these managers are also
confident in their own use of ICT, but certainly do not need to be experts.

I was the most enthusiastic person at the beginning, but now all the committee are on
board.

The project was timely. We wanted to change.

With this level of management support in the participating agency, a project which
offers good quality training, accessible technical support and appropriate software can
then make an enormous contribution. Without it, those inputs will still be valuable,
but their overall effect will be less.

Where the project has failed to have the desired impact, the reasons vary from
organisation to organisation, but seem to have been largely dependent on issues
outside the SRU‘s control. These include practical things like a change of premises or


organisational structure. They also include management issues, such as priorities
being focused elsewhere — on the requirements of a parent body, for example, or on
funding issues — or, quite simply, the project just coming along at the wrong time for
that organisation. In a small agency, constant client demands always come first, and
it is sometimes just not possible to divert sufficient attention into new developments,
however potentially beneficial in the long run.

The effect of a management that is really committed to the project, and to improving
its use of ICT, can also be seen in the responses of different organisations to change.
Those that had not engaged fully tended to see change as a hurdle that was too difficult
to overcome; those that had learned about strategic planning for ICT, and had gained
confidence at all levels in the organisation, were much better placed to take change in
their stride.


11. Additional lessons
Some of these points have been made previously, but are summarised here for
convenience. The others are drawn from the material above, and it is hoped that the
reasoning behind them will be obvious:

     Any project seeking to engage in long-term development work with a small
      group of agencies should plan for a significant level of turbulence, and the
      smaller the agencies, the more likely this is. This planning should include both
      an acceptance that agencies are likely to drop out, or start to participate less
fully in the project, and active steps to re-engage with agencies when their
      circumstances change.

     The Where IT's @ project demonstrates that an initiative such as this has to be
      on a rolling programme if agencies that drop out are to be replaced. This also
      means that the project has a much better chance of success because it can be
      introduced at the right time for each participating agency, rather than just when
      the money happens to be available. Any attempt to deliver similar benefits with
      funding in fixed blocks over a short time period would be much less likely to
      succeed.

     There is a difficult balance to strike between keeping in active and regular
      contact, and monitoring to the extent that it becomes intrusive. Sometimes
      there may be no option other than to wait to be told about changes or problems.

For many agencies the availability of helpful, accessible, hands-on ICT support
helped to give them confidence in the project. By providing tangible
      benefits it increased their interest in the project and bolstered their
      participation. The fact that it was never intended to be a full support service
      disappointed some, but the experience of receiving good quality support did help
      many to identify (and in a significant number of cases budget for meeting) their
      more general support needs.

     Different people and organisations have different attitudes to using ICT and to
      seeking information electronically. However good the facilities on offer, it is


    unreasonable to expect everyone to embrace them enthusiastically.
    Encouragement and training must be part of the package, but even then some
    agencies will participate more enthusiastically than others.

   It is inevitable that projects based around technology must take some technical
    risks, and should not be castigated if hindsight identifies better alternatives.
    Waiting until the perfect moment could mean waiting for ever, or certainly
    introducing unreasonable delays. The lessons learned from the two years of
    experience from 2002 to 2004 have meant a much smoother path for those
    agencies that joined the project later.

   A consequence of the above point is that projects must respond to aspects that
    work less well, and must be flexible and prepared to learn from experience.



Appendix A: Agencies participating in Where IT's @
(Those in bold participated throughout)

              January 2002                                  March 2006

     Action and Rights for Disabled People
                                                 African Community Welfare Association
     Age Concern                                 Age Concern
     Apna Ghar                                   Aanchal (name change)
     Bengali National Association
     Bow County Court Advice Service
     Breakthrough Advice Service
     Cairde na nGael                             Cairde na nGael
     Choice 136                                  Choice 136
     Community Links Advice Team                 Community Links Advice Team
                                                 Congolese Refugee Women‘s Association
     CORECOG                                     CORECOG
                                                 East London Financial Inclusion Unit
                     East London Black Women‘s Organisation*
     Eastwards Trust
     Hand in Hand
                     Howard Simons Advice Service*
                                                 Kenya Community Support Group
     London East Aids Network                    Positive East (name change)
     Newham Association of Disabled People
                                                 Newham Asian Women‘s Project
                                                 Newham Carers Network
     Newham Citizens Advice Bureau               Newham Citizens Advice Bureau
     Newham Tenants and Residents Federation     Newham Tenants and Residents Federation
                                                 Newham United Tamil Association
     Newham Women‘s Refuge
     Renewal Refugee and Migrant Project         Renewal Refugee and Migrant Project
                                                 Roma Support Group
     Stratford Advice Arcade                     Stratford Advice Arcade
          (delivery partner: houses several participants but does not have own advisers)
     Step Up
                                                 Uganda Asylum Seekers‘ Association
* Joined after the start but left before March 2006



Appendix B: Review methodology
The work programme for this review was carried out in April and May 2006, and
included:
    Desk review of documentation provided by the SRU.
    Telephone interviews with managers from the currently-participating agencies.
    A questionnaire for staff of the currently-participating agencies.

Initially two focus groups were also proposed, but this was later dropped.


Desk review
The SRU provided background descriptions of the project and its history, copies of
newsletters sent to participating agencies, and a detailed description of the technical
problems faced by the SDSL installation, and the measures taken to resolve them.
These provided valuable background in the preparation of this report and of the
interview and questionnaire materials.


Telephone interviews
These were carried out by Paul Ticher and Gill Taylor, using a standard structure.
The questions were outlined in the structure, but not scripted, to allow flexibility and
a more natural conversation.

The questions covered were:
   Describe the current state of the agency — size, staffing, activities, etc.
   How many staff provide advice and/or information? What proportion of the
     total?
   Has the interviewee been with the agency since it joined Where IT's @?
   How much does the interviewee know about the organisation‘s participation in
     the project?
   How has the agency changed over the past five years (moving premises,
     expansion, contraction, change of emphasis, etc.)?
   Describe the organisation‘s current ICT and effectiveness of use (using a series
     of indicators as prompts).
   How much help was the initial consultancy?
   How well do ICT problems get solved in the agency?
   If the agency has used the Where IT's @ technical support how much help has it
     been?
   How ICT literate/confident is the interviewee?
   How much help has the software been? How much is it used?
   Was the offer of training taken up?
   If so, how much help has the training been?
   Has the project enabled managers to have a better picture of their organisation‘s
     activities?
   Is the organisation confident that the security of information has been
     maintained?


     What would have happened without the Where IT's @ project?
     How has the project helped your organisation do things better?
     How has it helped your organisation do better things?
     How has the project affected your finances? Are you now able to cost and fund
      ICT better than before, or is it a drain on resources?

Managers were alerted by e-mail to the review exercise before being phoned to make
an appointment for the interview. The intention was to speak to the Chief Officer of
each organisation, since they were felt to have the best overview of the impact of the
project on their organisation (and details about its operation would be covered in the
questionnaire survey).

Where the chief officer was unavailable within a reasonable time, or inappropriate for
the interview, agencies were asked to put up a suitable senior manager. They were
strongly discouraged from delegating the task to an "ICT expert", with the
reassurance that the issues were not to be discussed at a technical level, and that the
manager‘s overview was felt to be more useful.

Inevitably, with the pressures of working in small organisations, it proved hard to
make contact with some managers or to find suitable times for interviews, but in the
end, by the beginning of May, 16 interviews were carried out. Those agencies omitted
were Stratford Advice Arcade (by design, since this agency had been a provider of
facilities rather than a user of the software) and three small agencies which had either
just joined the project or just had a change of manager.

In the event, ten of the interviews were with the chief officer, the remaining six with
people in a variety of posts. The interviews lasted between half an hour and an hour.

Detailed notes were taken, and used to provide the direct quotations in this report.


Questionnaire
The questionnaire survey was intended to look in more detail at the software, in
particular, and other aspects of the project from the point of view of advice staff in the
agencies.

Draft questionnaires were prepared in early April, and piloted by the SRU among
advice-work colleagues who were familiar with the software. This exercise suggested
that the questions were clear and relevant. A few minor changes were made, and the
questionnaire finalised. (See Appendix C.) The questionnaire was then amalgamated
with a second questionnaire being administered concurrently, looking at the use of the
SRU web site. Although this made the resulting questionnaire longer, it was felt,
possibly erroneously, that respondents would prefer filling in one long questionnaire to
two shorter ones.

The questionnaire was then made available on-line and e-mails with a link to the
questionnaire were sent to the managers who had been interviewed, asking them to
distribute the information to their advice workers and to encourage their staff to fill in


the questionnaire. The managers had been forewarned that this would happen, at the
end of their telephone interviews.

This exercise only produced about half a dozen immediate results, so the managers
were sent a reminder e-mail by the review team, and also encouraged by the SRU to
get their staff to complete the questionnaire. This brought responses up to about a
dozen.

A final effort was then made, by sending each agency one, two or three paper versions
of the questionnaire (depending on the size of the agency), with reply-paid envelopes,
in the hope that some respondents who had been unable or unsure about completing a
questionnaire on line might be happier with a paper one. This resulted in six paper
responses and one or two more by e-mail, for a grand total of 20 (from eight different
agencies).

Potentially the questionnaire could have been completed by about 120 paid and
volunteer advice work staff from the 16 agencies interviewed. The response rate is
therefore around 16%. The material gathered was interesting, and contributed a lot to
the information available for this review. The exercise was therefore worth carrying
out. A response rate of 16%, after considerable prompting (and the incentive of a draw
for a relevant reference book), is however disappointing. The length of the
questionnaire, at 89 questions, is almost certainly one factor, although the on line
version was less onerous in that it had built-in routeing so that respondents were not
presented with questions which previous answers had indicated would be irrelevant to
them.
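
As a purely illustrative sketch (the survey package actually used is not described
here), this routeing can be modelled in a few lines of Python, using the skip
instructions printed in the questionnaire in Appendix C, such as Question 1's "if no,
go straight to Question 12":

    # A minimal model of the questionnaire's skip logic, based only on the routeing
    # instructions visible in Appendix C. Anything not listed moves to the next question.
    SKIP_RULES = {
        (1, "No"): 12,   # never used AIMS: skip the AIMS questions
        (12, "No"): 21,  # never used Lisson Grove: skip to the CPAG section
        (21, "No"): 29,  # never used the CPAG CD-ROM: skip to AdviserNet
        (29, "No"): 37,  # never used AdviserNet: skip to the training section
        (45, "No"): 49,  # no Lisson Grove training: skip its follow-up questions
        (70, "No"): 83,  # never heard of the SRU web site: skip to "About you"
    }

    def next_question(current: int, answer: str) -> int:
        """Return the number of the next question to present to the respondent."""
        return SKIP_RULES.get((current, answer), current + 1)

    # Example: a respondent who has never used AIMS goes straight from Q1 to Q12.
    assert next_question(1, "No") == 12
    assert next_question(1, "Yes") == 2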

It is also noticeable that the overwhelming majority of responses came from agencies
where the project had been an undoubted success. This suggests that managers and
staff at those agencies felt sufficiently committed to the Where IT's @ project to spend
time responding to the questionnaire, whereas those where the project had had less
impact felt less of an obligation.

This bias in favour of agencies where the project has worked well has been taken into
account in the discussion above.

The questionnaire was analysed using the Snap package, but in view of the small
number of replies, no cross-tabulations were carried out as it was felt that the data
generated would be unreliable.


Focus groups
It was originally intended to hold two half-day focus groups with users of the software
to explore in more detail what they liked about it and what they did not. Participants
were to be recruited from respondents to the questionnaire who indicated a
willingness to take part. In the event, only two volunteers came forward, out of the 20
questionnaire respondents. It was felt that the effort involved in recruiting more, in
view of the lack of response to the questionnaire, would be unproductive.



Appendix C: Questionnaire
(Note: the text and style are identical to the paper version sent out, but the pagination
is not the same.)

                    Newham Council Social Regeneration Unit
                        Where IT’s @ review and evaluation

Introduction
Your organisation has been participating in the Where IT’s @ programme of ICT support
for advice workers and their colleagues. The programme is run by Newham SRU who have
asked an independent person, Paul Ticher, to review the project, and we are hoping you
can help us by answering these questions. It should take no more than 15 – 25 minutes,
and possibly a lot less.

Even if some of the questions don’t apply to you, or if you have not had much contact with
the project, it would still be helpful if you could answer as many questions as possible.

Your answers will be confidential; statistics will be shared with the council, but not your
individual replies. A summary of Paul Ticher’s report will be made available on the SRU
web site at http://sru.newham.gov.uk

Every completed questionnaire will be entered for a prize draw. The winner will receive a
copy of the Disability Rights Handbook for their organisation.


A: About the software
The Where IT's @ project made software available to the participating agencies. We would
like to know what you thought of each of the products you used.

AIMS
1.   Have you used AIMS at all?      Yes       No (if no, go straight to Question 12)

2.   How much do you normally use AIMS?
      Every day  Several times a week             Several times a month        Less often

3.   What do you use AIMS for? (Tick all that apply)
      Entering data about your own clients
      Entering data about other colleagues’ clients
      Using data about individual clients (to send letters, for example)
      Using data about groups of clients (to produce statistics, for example)
      Other (please describe):




4.    How useful is AIMS?
       Very useful  Quite useful          Not very useful     Not useful at all

5.    How does AIMS compare with the case recording system you had before?
       A big improvement                Worse than before
       A bit of an improvement          I haven’t used any system except AIMS
       No different really

6.    To what extent does AIMS provide you with better information about your service
      users and their needs?
       A lot             Quite a lot    Some              Not much

7.    To what extent has AIMS improved your organisation’s ability to collate information
      about its work and influence long-term planning of services?
       A lot            Quite a lot       Some              Not much

8.    How easy is it to use the AIMS software?
       Very easy          Quite easy      Not very easy  Not easy at all

9. How could the AIMS software be improved?


10.   Do you ever find that you need to use the AIMS software but can’t access it?
              No             Yes, occasionally        Yes, frequently

11.   If you have had problems, what were the most common ones?
       Someone else was using the system
       I hadn’t been trained on the system and couldn’t make it work
       I had been trained but couldn’t remember how to make it work
       The software itself didn’t work properly
       Something technical but I don’t know exactly what
       My organisation’s computer system wasn’t working
       My organisation’s internet connection wasn’t working
       The server at London Borough of Newham wasn’t working
       Other (please describe):


The Lisson Grove benefits and tax credit software
12.   Have you used the Lisson Grove software at all?  Yes  No (if no, go to Q. 21)

13.   How much do you normally use the Lisson Grove software?
       Every day  Several times a week        Several times a month         Less often




14.   What do you use the Lisson Grove software for? (Please tick all that apply)
       Benefits calculations
       Tax credit calculations
       Quick benefits calculations
       Other (please describe):

15.   How useful is the Lisson Grove software?
       Very useful  Quite useful          Not very useful      Not useful at all

16.   How does the Lisson Grove software compare with the way you used to calculate
      benefits (either manually or using a different program)?
       A big improvement                     Worse than before
       A bit of an improvement               I have never calculated benefits any other way
       No different really

17.   How easy is it to use the Lisson Grove software?
       Very easy        Quite easy         Not very easy       Not easy at all

18.   How could the Lisson Grove software be improved?




19.   Do you ever find that you need to use the Lisson Grove software but can’t access it?
              No             Yes, occasionally         Yes, frequently

20.   If you have had problems, what were the most common ones?
       Someone else was using the system
       I hadn’t been trained on the system and couldn’t make it work
       I had been trained but couldn’t remember how to make it work
       The software itself didn’t work properly
       Something technical but I don’t know exactly what
       My organisation’s computer system wasn’t working
       My organisation’s internet connection wasn’t working
       The server at London Borough of Newham wasn’t working
       Other (please describe):


The CPAG welfare benefits CD-ROM
21.   Have you used the CPAG CD-ROM at all?   Yes   No (if no, go to Question 29)


22.   How much do you normally use the CPAG CD-ROM?
       Every day  Several times a week      Several times a month           Less often

23.   How useful is the CPAG CD-ROM?
       Very useful  Quite useful          Not very useful      Not useful at all

24.   How does the CPAG CD-ROM compare with the printed handbooks?
       A big improvement            Worse than the handbook
       A bit of an improvement      I haven’t used the printed handbooks
       No different really

25.   How easy is it to find what you are looking for in the CPAG CD-ROM?
       Very easy         Quite easy        Not very easy        Not easy at all

26.   How could the CPAG CD-ROM be improved?

27.   Do you ever find that you need to use the CPAG CD-ROM but can’t access it?
              No             Yes, occasionally       Yes, frequently

28.   If you have had problems, what were the most common ones?
       Someone else was using the system
       I hadn’t been trained on the system and couldn’t make it work
       I had been trained but couldn’t remember how to make it work
       The software itself didn’t work properly
       Something technical but I don’t know exactly what
       My organisation’s computer system wasn’t working
       My organisation’s internet connection wasn’t working
       The server at London Borough of Newham wasn’t working
       Other (please describe):


AdviserNet (formerly called the Citizens Advice Electronic Information System)
29.   Have you used AdviserNet at all?   Yes   No (if no, go to Question 37)

30.   How much do you normally use AdviserNet?
       Every day  Several times a week       Several times a month          Less often

31.   How useful is AdviserNet?
       Very useful  Quite useful          Not very useful      Not useful at all

32.   How does AdviserNet compare with the paper version of the information system?
       A big improvement                Worse than the paper version


       A bit of an improvement             I haven’t used the paper version
       No different really

33.   How easy is it to find what you are looking for in AdviserNet?
       Very easy         Quite easy        Not very easy        Not easy at all

34.   How could AdviserNet be improved?


35.   Do you ever find that you need to use AdviserNet but can’t access it?
              No             Yes, occasionally         Yes, frequently

36.   If you have had problems, what were the most common ones?
       Someone else was using the system
       I hadn’t been trained on the system and couldn’t make it work
       I had been trained but couldn’t remember how to make it work
       The software itself didn’t work properly
       Something technical but I don’t know exactly what
       My organisation’s computer system wasn’t working
       My organisation’s internet connection wasn’t working
       The server at London Borough of Newham wasn’t working
       Other (please describe):

B: About the training
Over the course of the Where IT's @ project so far there have been several training
opportunities, about using the internet, about the programs provided through the project,
and about managing computers. We would like to know what you thought about the ones
you attended.

Here’s Where IT's @
37.   Did you attend either of the Here’s Where IT's @ training courses:
      Using the internet for advisers       Yes          No
      Using advice software                 Yes          No

38.   How good was the Here’s Where IT's @ training?
       Excellent      Good              Average             Not good at all

39.   How much better could you use the internet for your advice work afterwards?
       A lot better    A bit better     No real difference

40.   How could the Here’s Where IT's @ training have been improved?


AIMS
41.   Did you attend either of the following AIMS training courses:
      AIMS Administrator                     Yes         No
      AIMS Reports                           Yes         No

42.   How good was the AIMS training?
       Excellent      Good                Average           Not good at all

43.   How much better could you do your work (those parts that relate to AIMS) after the
      AIMS training?
       A lot better    A bit better    No real difference

44.   How could the AIMS training have been improved?



Lisson Grove Quick Benefit Calculator
45.   Did you attend the Lisson Grove training?             Yes  No (go to Q.49)

46.   How good was the Lisson Grove training?
       Excellent      Good              Average             Not good at all

47.   How much better could you do your benefits calculations after the Lisson Grove
      training?
       A lot better    A bit better    No real difference

48.   How could the Lisson Grove training have been improved?



ICT management training
49.   Did you attend either of the following ICT management training courses
      Managing and Financing ICT (2002)  Yes           No
      Managing ICT (March 2006)              Yes       No

50.   How good was the ICT management training?
       Excellent      Good            Average               Not good at all

51.   How much better could you manage and budget for your ICT after the training?
       A lot better    A bit better   No real difference

52.   How could the ICT management training have been improved?




Other training
53.   Was there any other training that you think should have been made available which
      would have helped you make better use of your computers for advice and
      information work?


54.   Would you now be interested in a briefing session on how to use the SRU web site?
       Definitely     Possibly           No


C: About the Where IT's @ project overall
In this section we ask how useful you feel the Where IT's @ project has been overall.

55.   How much benefit has the project been to your organisation?
       A lot      Quite a lot         Some        Not much            Don’t know

56.   How much benefit has the project been to your service users?
       A lot      Quite a lot         Some        Not much            Don’t know

57.   What has been the best thing about the Where IT's @ project?




58.   What has been the worst thing about the Where IT's @ project?




59.   Do you have any other comments on the Where IT's @ project?


D: About your organisation’s computers and internet access
In this section we are asking about ICT (information and communications technology) in
general — in other words your organisation’s computers and internet access.

60.   How good do you think your organisation’s provision of ICT is, in general? (For
      example: are there enough computers? are they up to date? do they have good
      software? how does it compare with other organisations you may have worked in?)
       Excellent        Good            Average                    Not good at all

61.   How suitable is the ICT available to you for doing your work? (For example: do you
      have the use of a computer whenever you need it? is your computer fast enough to


      do the work? is everything set up to ensure that you avoid health and safety
      problems? if you have a disability, is the hardware or software adapted to enable you
      to work it properly?)
       Ideal             Mostly fine         Not good enough        Not good at all

62.   When ICT problems occur in your organisation, how often do they get resolved
      quickly and expertly?
       Always            Usually        Sometimes         Not usually

63.   Who deals with computer problems in your organisation? (Please tick all that apply)
       Don’t know
       An in-house computer expert
       A member of staff who happens to know more about computers than others
       An outside maintenance or support company
       Someone from another voluntary organisation
       A volunteer
       A friend or relative of someone on the staff
       Other: (please describe)


64.   How often does your ICT suffer from any of the following problems:
      Computer viruses                 Often        Occasionally      Rarely or never
      Unwanted e-mail (spam)           Often        Occasionally      Rarely or never
      Computer files getting lost      Often        Occasionally      Rarely or never

65.   How much better is your organisation using ICT now than it was five years ago (or
      when you started)?
       A lot        Quite a lot       Some         Not much          Don’t know

66.  Which of the following statements best describes the access you have to the internet
     for your work (whether you are paid or voluntary)?
      The organisation provides a computer with internet access, just for myself
       The organisation provides a computer with internet access, which I share with others
      I have to use my own home computer if I need to access the internet
      I have the use of a computer but no internet access
      I don’t use a computer at all

67.   Do you use the internet to help with your work? (Please tick all that apply)
       Yes, I use it to help clients directly (by filling in forms on line, for example)
       Yes, I use it to look up information that clients, service users or enquirers need
       Yes, I use it to look up information that I need in the course of my work


       Yes, I use it to find out about training courses and other resources to help me
         work better

68.  How important is the internet to your work?
      I always think of using the internet first when I need information
      Sometimes I use the internet first and sometimes I use paper information first
       I always think of using paper information first, but I can use the internet if necessary
      It never occurs to me to use the internet

69.   How confident do you feel about using the Internet?
       Very confident; I use it all the time and know how to find what I am looking for
       Quite confident; I use it a lot and can usually find what I am looking for
       Not very confident; I do use it, but sometimes I can’t make it work
       Not confident at all; I try to avoid using the internet if I can


E: About your use of the SRU web site
70.   Have you heard of the SRU web site?
       Yes             No              I think so, but I’m not sure
      (If No, please go to Question 83)

71.   How did you hear of the SRU web site?
       I can’t remember                               From a colleague
       From information sent by the SRU               At a meeting or training course
       Other (please describe):


72.   How often do you use the SRU web site?
       Most days     Most weeks         Most months            Less than that

73.   Which features of the SRU web site do you use?
      Look at the What’s new page                Often            Sometimes      Never
      Download publications                      Often            Sometimes      Never
      Order publications                         Often            Sometimes      Never
      See what training courses are available    Often            Sometimes      Never
      Book a training course                     Often            Sometimes      Never
      Get up-to-date information on the NRF Poverty
         and Income Maximisation Programme  Often               Sometimes  Never
      Find out when the next welfare rights and
         money advice forums are taking place  Often            Sometimes  Never
      Download forum agendas and minutes         Often          Sometimes  Never
      E-mail the SRU with queries on benefits


        advice and money advice                   Often      Sometimes  Never

74.   When you want to use the SRU web site the first page asks you to log on. How does
      this affect your use of the site?
       It’s no problem at all
       It’s a bit inconvenient but doesn’t stop me
       I sometimes don’t bother because logging on is too inconvenient
       I did succeed in logging on once or twice but now I can’t remember how to
       I’ve never managed to log on

75.   Other than the SRU web site, which three web sites do you find most useful in your
      advice work?




76.   How easy is it to use the SRU web site?
       Very easy        Quite easy once you get the hang of it    Not very easy

77.   What is missing from the SRU web site that you would like to see included?




78.   What would make it easier to use the SRU web site?




79.   What technical problems have you had in using the SRU web site?




80.   Have you ever tried to download a .pdf from the SRU web site?
       Yes, and it worked fine                      No
       Yes, but it didn’t work                      I don’t understand the question
         Please say why:


81.   If you said Yes to the question above, which version of Adobe Acrobat do you have?


82.   Do you have any other comments on the SRU web site?



F: About you
83. Which organisation do you work for (either paid or voluntary)? (We need to know this
in case you win the Disability Rights Handbook in the prize draw.)
       African Community Welfare Association
       Age Concern
       Aanchal
       Cairde na nGael
       Choice 136
       Community Links
       Congolese Refugee Women’s Association
       CORECOG
       East London Financial Inclusion Unit
       Kenya Community Support Group
       Positive East
       Newham Asian Women’s Project
       Newham Carers Network
       Newham Citizens Advice Bureau
       Newham Tenants and Residents Federation
       Newham United Tamil Association
       Renewal Refugee and Migrant Project
       Roma Support Group
       Uganda Asylum Seeker’s Association
       Other (please say which):

84.   Are you:
       Paid full time    Paid part time  A volunteer

85.   Is your main role in the organisation:
       Advice-giving  Information-giving/sign-posting  Administration

86.   Is your main area of expertise:
       Welfare benefits  Housing        Immigration      Health
       Other (please describe)

87.   How confident do you feel about using computers?
       Expert          Pretty good       A bit nervous  Very nervous

88.   How has your computer confidence changed over the last three years?
       Much more confident  A bit more confident         No real change


Many thanks for completing this evaluation

				