Assessing Institutional Digital Assets (AIDA)

    A JISC project
     Final Report
        Written by Ed Pinsent
     Contact e.pinsent@ulcc.ac.uk
              May 2009




Table of contents
Acknowledgements
Executive Summary
Background
Aims and Objectives
Methodology
Implementation
Outputs and Results
Outcomes
Conclusions
Implications
Recommendations
References
Appendixes
APPENDIX 1
APPENDIX 2
APPENDIX 3








Acknowledgements
The AIDA project has been funded by the JISC under its Digital Preservation and
Records Management Programme.
http://www.jisc.ac.uk/whatwedo/programmes/preservation/aida.aspx
http://www.jisc.ac.uk/whatwedo/programmes/preservation.aspx


The project ran 1 October 2007 - 31 March 2009.
Project website: http://aida.jiscinvolve.org/
Project Director: Kevin Ashley
Project Manager: Ed Pinsent
With thanks to Ann Barrett, Ann E Jones, Michelle Alexander, Steve Williams and
Sue Hodges for kindly participating as case studies.
At ULCC, Colin Love gave essential help for the Technology leg of the toolkit. Silvia
Arango-Docio designed the scorecard spreadsheet, and Patricia Sleeman gave much
valuable administrative and QA support to the project.




                                                    University of London Computer Centre
                                                                         20 Guilford Street
                                                                      London WC1N 1DZ
                                                                             0207 692 1345
                                                                     http://www.ulcc.ac.uk







Executive Summary
AIDA’s guiding principle is that it’s not possible to have a one-size-fits-all approach
to digital asset assessment. We believe that an institution needs insight into its
capabilities and maturity before it can decide on appropriate mechanisms or tools to
use in resolving digital asset management issues. That belief was based partly on our
experience with Cornell’s 3-legged, 5-stage maturity model used in digital
preservation, and partly on our knowledge of reactions to tools such as LIFE, DROID
and Drambora. Some immediately appreciated the potential value of one or more of
these tools; some believed that tools required too much knowledge to use; and others
felt that some tools were unlikely to tell them anything that they didn’t
already know. None of these assessments was wrong – each tool is aimed at an
audience with a different set of skills and requirements, whether or not those
assumptions are explicitly stated.
The project sought to validate this approach through the development and testing of a
self-assessment toolkit. Tests were carried out in a variety of institutional settings,
after which the toolkit was revised and re-tested. One finding from the initial tests was
that we needed to adapt the tool so that it could be used at a level below that of the
institution – that of a project, a department, or a service.
Once the self-assessment toolkit approach was validated, we worked to produce some
companion resources. The self-help guide provides guidance on appropriate steps to
move an institution between stages of the maturity model. It can aid institutions in a
process of continual improvement. The toolkit was also enhanced by an initial
assessment of asset management and assessment tools which are appropriate to
organisations which are at different stages of maturity and capability with respect to
digital asset management.
AIDA has delivered a workable toolkit which will allow institutions based in UK
Higher and Further Education to perform an assessment of their capability to manage
their digital assets. Unlike other assessment surveys, AIDA is not focussed on
individual asset types, nor on file formats, nor on extant systems for managing assets.
Instead, it takes a holistic view using a method that encompasses an understanding of
organisational, technological, and resource issues. The project has arrived at the
toolkit from a distillation of currently available standards, projects, toolkits and
methodologies.
We recognise that the tools themselves could be delivered more advantageously in
different ways. Drambora’s online tool, which also allows for repeat assessments with
a view to process improvement, was an inspirational model. We’ve begun work on a
cloud-based version of the tool using Google Docs and wish to develop a community-
driven approach to the matching of tools with maturity levels. ULCC intend to
develop an interactive version of the tool in the coming months.
AIDA's contribution has not been to provide anything especially innovative in the
field of digital asset management, but it has hopefully taken a step towards making
the concepts more approachable and understandable to a wider range of professionals.
Many tools exist; the challenge can be in identifying those that are appropriate to your
own circumstances. If one has just had training in basic first aid, the tools used by a
typical neurosurgeon are unlikely to be useable, or even comprehensible. So it is with
digital assets.





The toolkit would still benefit from further testing, but our experience has shown that
the balanced three-leg approach is sound and stands a chance of working as part of an
assessment methodology. Our matching of tools to maturity levels has not received
wide validation and will undoubtedly be contested by some. We would welcome
further debate and engagement on these issues which can only help to improve the
value of AIDA and the utility of the tools it describes.








Background
Institutions need to measure their overall capability for asset management:
to assess their ability to manage their digital assets. Recognising current institutional
capabilities is an essential pre-requisite for taking effective decisions about how to
preserve assets. We know that institutions have digital assets, and we also know they
are being produced in large quantities, in numerous places, by many diverse creators,
and for lots of different purposes and different audiences or users. Institutions differ in
how centrally these activities are managed and coordinated, and in the way in which
knowledge is spread throughout the organisation.
There is a plethora of tools available for dealing with different aspects of the
problem, but some of the tools themselves require knowledge or resources that are not
available in all settings. LIFE (a costing tool) is only useful if you have a good sense
of the type and quantity of resources that you are responsible for. Not everyone has
this knowledge – tools such as DAF (the Data Audit Framework) have been
developed to help with that problem of building up inventories. Some tools are good
at describing these pre-requisites and some are not. AIDA attempts to help an
institution understand how far it’s already got, where it needs to go next and what
other tools might help it get there.
AIDA is a successor project to the Digital Asset Assessment Tool (DAAT), a project
also managed by ULCC and funded in the JISC 4/04 programme. DAAT tried to deal
with the problem of building asset inventories and assessing preservation risk,
building on a model used for traditional library, archive and museum resources
developed by the National Preservation Office. That model struggled in the digital
domain, and our experience with it showed that many of its potential users were
unable to answer the basic questions necessary to determine the level of risk to their
assets. (Such problems could also arise with traditional materials, but were less likely
to; the NPO provided a paid-for service to allow institutions in such settings to use the
tool.) However, our report raised a number of pertinent questions and implications,
some of which have fed directly into AIDA. We learned that we need to survey entire
systems, not just selected objects; we need to find more assets by widening the range
of the survey; and that a better approach would incorporate organisational models and
integrated technological tools, including (for example) media testing and technology
watch. We also realised that for some, any of these approaches was too demanding.
They needed to start somewhere else.
AIDA tries to address a requirement, expressed by JISC and others, to include a
wider range of institutional information professionals, such as archivists
and records managers, in the process, and to make digital asset management more
accessible to them. Quite often it is those staff with traditional paper-based
information management skills (archives, RM, library) who feel they need to learn the
most about the management of digital objects, including their preservation. AIDA
may not be saying anything radically new, but it is saying it in a language and a
format that is approachable by more people (rather than something exclusively
predicated on repository managers or system administrators), and in ways which make
the concepts graspable by more participants.
If inclusiveness is one of the issues, we think that the Digital Preservation Training
Programme (DPTP), which is run by ULCC, embodies that inclusiveness. Over time







[Figure 1: the three-legged stool model (organisation, technology and resources)]

[Figure 2: the five-stage scale of readiness or maturity]







we have had a wide range of attendance at these programmes, with course participants
from the government and private sectors, representing such diverse skills as
librarianship, digitisation, web archiving, records management, project management,
and more. At this programme, as a way of teaching an understanding of the OAIS,
we've been using key concepts which originated from Cornell University.
Anne Kenney and Nancy McGovern of Cornell University had developed two very
strong and easily apprehensible models for describing institutional readiness and
capabilities for digital preservation. The two models are a 3-legged stool illustrating
three essential areas (resources, technology and organisation) and a 5-stage scale of
readiness or maturity which relates to activities in each of these areas.
The strength of the three legs model (see Figure 1) is that it teaches that organisation,
technology and resources are all equally important components in digital preservation
(most people assume that strong technology is the only answer; others think it best to
throw money at the problem). The visual metaphor of the stool makes it clear that an
imbalance in any of these aspects – too much technology, or too little organisational
readiness, for example – leads to instability. The five-stage scale (see Figure 2) is also
a very useful structured way of measuring maturity or capability, and provides a clear
sense of how to move things forward within an institution in order to improve digital
preservation.
The AIDA project believes these models can be adapted to deal with the more general
problem of digital asset management, of which digital preservation is but one
component.
AIDA's thinking was that we could crosswalk the three legs and five stages against
existing methodologies, toolkits and projects. At the same time we had to keep our
focus on asset management, not just digital preservation; DP is one possible outcome
of asset management, which implies (for example) maintenance, access, re-use,
disposal and destruction as other possible outcomes.
There was an implied need to understand when to use tools, such as PRESERV and
PRONOM (automated file format expiry alert systems); the DPC decision tree and
JISC Lifecycle guidance (mostly to do with assisting institutional selection decisions);
and LIFE (understanding the cost implications of preservation decisions).
New tools are continually becoming available, and two of particular relevance were
released whilst AIDA was in development. Drambora and the Data Audit Framework
(DAF) both address aspects of the asset management problem and DAF at least is
targeted at HE institutions. But we would maintain that neither is immediately useful
in all settings. Drambora requires a good deal of self-insight and preparedness to be
able to provide the information and evidence needed for it to work effectively, and
institutions that are just beginning to deal with the problem are likely to find it too
challenging to use. DAF, on the other hand, will be of value in such settings, but will
have less to tell a research institution which already has effective inventories and asset
management policies in place. Yet Drambora can still be of use to such an institution.
The main differences are:
Drambora sees everything as an extension of risk management (what is going to
happen to your institution if you don't manage these assets?), whereas in AIDA risk
management is just one of many things to worry about. Drambora is predicated on
the Trusted Repository model and may prove more challenging to apply in settings
which do not map well to that worldview.





DAF is concerned with only one kind of asset, research data, whereas AIDA is interested in
many kinds of assets (datasets, records, images, libraries, papers, digitised collections,
learning objects). DAF is auditing research data with a view to ascertaining and
measuring their value to academics and for research purposes; and the aim is to
improve the management of this data. One of its outputs will be a software tool that
supports and facilitates data audit with this in mind. DAF helps an institution to
ensure that valuable research data is preserved and remains accessible over time.
AIDA suggests using Drambora and DAF to supplement parts of our assessment,
to gain more information about certain aspects of institutional behaviour and to
improve behaviours in these areas.








Aims and Objectives
The central aim of the AIDA project was the production of an assessment toolkit, to
enable institutions to perform self-assessment of their readiness for digital asset
management. The toolkit was to be tested internally at ULCC, in partner institutions,
and externally. The project plan spoke largely in terms of digital preservation, but
AIDA's scope widened to include the management and creation of digital assets.
The project also had these specific aims and objectives:
      Brief fact-finding stage, to gain some familiarity with the sorts of digital assets
       being created in institutions and to gain knowledge of their requirements.
      Recruitment of case study test sites to develop and assess the toolkit and
       produce case studies from these testing operations.
      The production of a self-help guide with suggestions and recommendations
       about moving forward, and how to change.
      A decision-making tool which provides guidance on performing digital asset
       management.
      Collaboration with LIFE, Planets and Caspar.
      Dissemination and communication through a website.
      Workshop to present the toolkit.
Most, but not all, of these aims have been realised or are close to realisation. The self-
help guide and the accompanying matching of tools to maturity levels are undergoing
final QA and are intended for release towards the end of May 2009. The decision-
making tool’s function has been subsumed in the self-help guide. Promotion of the
products of AIDA has taken place, but the planned workshop is likely to be deferred
to summer 2009. Although formal collaboration with LIFE, Planets and Caspar was
not possible, AIDA staff have monitored the work of these projects and engaged
closely with some of them, particularly LIFE and its successors (LIFE2 and LIFE3).








Methodology
The overall methodology involved a cycle of internal development, accompanied by
research into applicable tools and models from elsewhere, external validation in
typical target sites, refinement based on feedback from evaluators, and a further cycle
of external validation and refinement. The external evaluators were offered support
from the project, but not all chose to take this up. The support extended to site visits
in some cases.
Model methodology
The task was to create an update of the Cornell model. The Cornell maturity model
was created with digital preservation in mind, but AIDA is concerned with asset
management and asset creation. The task was to modify Cornell's definitions,
descriptions and key indicators to be more generalised.
Toolkit methodology
Starting from the Cornell models (three legs, five stages), the task was to align them
with elements of asset management activity in an institution that could be audited.
These auditable elements were adapted and compressed from four extant sources
(TRAC, NESTOR, etc; see section on Implementation for full list). The toolkit was to
be updated and improved, with feedback on the process and the questions received
from the case studies. The initial feedback encouraged us to add concrete examples as
well as abstract description to the self-assessment tool. People found it easier to
recognise themselves or their institutions using this method.
Case studies methodology
We needed to recruit evaluators from HFE institutions in the UK, who would be
willing to act as participants and trial the toolkit within their own institutions. We
planned a recruitment drive in two stages, of which the second stage involved making
a callout on ULCC's dablog. The approach was to be targeted at people specifically
connected with asset management.
Scorecard methodology
We needed some way of analysing the scores, which we envisaged would require
some form of weighted scorecard.
Self-help guide methodology
The task was to devise guidelines that would help institutions move forward along the
stages path. Information was drawn from materials supporting the use of the model in
DPTP, from parallel tools and from evaluators’ own experiences.
Dissemination and communication methodology
Originally the intention was to add posts to dablog, tagging them with "AIDA"; as the
project progressed, its entire history could be retrieved from the blog with one single
search action (or could be expressed as a link such as
http://dablog.ulcc.ac.uk/tag/aida/). This strategy came to be replaced by building a
JISCinvolve website, and information and reports were filed there as completed.
Dissemination has also been supported through the use of external evaluation sites,
who have passed on knowledge and experience to others through formal and informal
methods.





Issues to be addressed by our methodology
The AIDA methodology has largely been a compression and condensation of existing
methods, standards and toolkits associated with digital preservation, adapting them
constantly to match digital asset management and the needs of HFE institutions.








Implementation
Update on Cornell Model
We re-examined the wording and generalised it where possible to reflect
the use, management and creation of digital assets in academic contexts. This resulted
in a five-page document to be exposed for comment on the AIDA website, where it
was published in May 2008 1. This exercise was very useful as a foundation to
building the self-assessment toolkit, which was the next stage of the project.
Building the self-assessment toolkit
The task was to devise a number of auditable elements within the Organisation,
Technology and Resources legs. Each auditable element then had to be expressed
at each of the five stages of compliance. The skeleton structure of the three
legs and five stages was to be fleshed out with other sources, and the toolkit
enriched further with examples gleaned from the case studies.
The four main sources used were:
         1: An Audit Checklist for the Certification of Trusted Digital Repositories.
         This is the document originally produced by RLG-NARA, which later became
         Trustworthy Repositories Audit and Certification (TRAC): Criteria and
         Checklist 2.
         2: The Cornell University Survey of Institutional Readiness.
         3: Summary of RLG-OCLC Framework Component Characteristics.
         4: Network of Expertise in long-term STORage (NESTOR). 3
Working with these sources, we undertook a process of comparison to find areas of
commonality among the statements, questions and elements found in these
documents. The spreadsheet (Appendix 1) exposes this comparison process.
To create the AIDA self-assessment toolkit, we generalised some elements and
worked on changing the wording so that they didn't refer exclusively to digital
preservation. This process involved the selection of some elements, and the exclusion
of others. There are also significant areas of activity suggested by TRAC, for
example, which aren't yet being included in AIDA; namely the Designated User
Community, the matter of object management, and Repository Management. Early
drafts of our toolkit included reference to the User Community, but this was later
dropped. We’re still concerned that this misses a critical aspect of asset management:
who are the assets being retained for and what do they intend to do with them? This
question doesn’t always have a straightforward answer, but sometimes it does.
Considering the question should help asset management decisions even if the question
cannot be answered.
The format of the AIDA toolkit was based very closely on toolkits used and produced
by the JISC-funded TrustDR project 4. For example:


1. The document is available at http://aida.jiscinvolve.org/project-documents/.
2. Published in 2007 by The Center for Research Libraries (CRL) and Online Computer Library Center, Inc. (OCLC). See http://www.crl.edu/content.asp?l1=13&l2=58&l3=162&l4=91
3. See http://www.langzeitarchivierung.de/index.php?newlang=eng.
4. Trust in Digital Repositories, at http://trustdr.ulster.ac.uk/





         Analysis and audit tool for rights management in learning object repositories,
         by Jackie Proven, John Casey, and David Dripps. University of Ulster and
         UHI Millennium Institute. (Tool 2B)
         A Managed Learning Environment Integration Matrix. From the Scottish
         Funding Council E-Learning Implementation Guide. (Tool 1A) 5
Tool 2B above was originally devised for managing Intellectual Property Rights (IPR)
in digital learning objects. But it so happened that TrustDR used a structure that
matched very closely what AIDA wanted to achieve. For each IPR element, they
proposed five levels of implementation (supported with descriptive characteristics),
and these levels are very close to the Cornell five organisational stages. The TrustDR
toolkits also used indicators and exemplars, and comments from real world users 6.
Like TrustDR, the AIDA toolkit needed to express degrees of compliance for
particular aspects of management and preservation behaviour, including
technology. A senior staff member of ULCC's Infrastructure Services assisted with
QA of the technology leg.
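
Purely as an illustration of this structure (the toolkit itself is a Word document, not
software), one auditable element with its five levels of implementation might be
modelled as follows; the element name and stage wordings here are invented, not
taken from the actual toolkit.

    from dataclasses import dataclass
    from typing import Dict

    @dataclass
    class AuditableElement:
        """One auditable element: it belongs to a leg and carries a short
        description of what each of the five stages looks like for it."""
        leg: str                            # "Organisation", "Technology" or "Resources"
        name: str
        stage_descriptions: Dict[int, str]  # keys 1-5, one description per stage

    # Hypothetical example only:
    mission = AuditableElement(
        leg="Organisation",
        name="Mission statement",
        stage_descriptions={
            1: "The need for a statement covering digital assets is acknowledged.",
            2: "A draft statement exists for some collections.",
            3: "An agreed statement is applied across the institution.",
            4: "The statement is embedded in policy and reviewed regularly.",
            5: "The statement is shared and benchmarked externally.",
        },
    )
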
At this time, we also devised an Inventory Questionnaire. This was a separate form
that incorporated some questions from the Cornell Survey of Institutional Readiness,
including for example questions on file formats and media storage. We decided this
output was not suitable for audit or assessment purposes, but carried it
forward as a tool that may have some value for self-improvement at a later
stage of the project.
The last step was to create a blank scorecard, using a Word document with protected
form fields.
We added five pages of introductory material to the toolkit to explain what it was
about and how an institution might use it. These preliminaries included:
        A definition of digital assets
        Various examples of digital assets
        An explanation of what is meant by digital asset management
        A set of general guidelines
In guiding our intended users, we suggested they look for the stage which best
describes the place they are working in. We added the caveat that there would never
be an 'exact match', and that participants would probably experience difficulty in
selecting one stage or another. If they were hovering, our advice was not to spend too
much time deciding between, say, stages 2 and 3, and to pick the lower of the two.
We also worked hard to encourage feedback on the process, asking participants to tell
us what it was like working through the toolkit, whether they understood it, and
whether everything was clear.
Implementing case studies
Our initial thinking was that we would like the case studies to reflect as many stages
of ‘maturity’ as possible, with an institution such as UKDA representing one of the

5. Both retrieved from http://trustdr.ulster.ac.uk/outputs.php on 25th April 2008.
6. As AIDA becomes more developed and more widely used, we also hope to include such real-world statements within the toolkit.





more mature and developed stages. Unfortunately we were not able to recruit an
institution with that level of maturity in asset management practice, which means that
the upper levels of the assessment tool have not been as rigorously tested.
Our recruitment drive happened in two separate stages. During Stage 1, around March
2008, we approached a number of institutions, from national libraries to individual
projects.
Some of these proved to be poor selections on ULCC's part, and some were picked
solely on the strength of the interesting nature of the collections held by the institution
(rather than their capacity to manage or preserve them as digital assets). Many of
these selections did not progress, despite some encouraging responses.
Stage 2, during June-July 2008, was more successful and recruited more participants.
We made a callout in June ('How safe are your digital assets?'), built the AIDA blog
website 7, and published a post on ULCC's dablog 8.
In addition, we proactively sent out a pro-forma email to some likely candidates.
Their selection was a rather laborious process, involving searching through the online
directories of Universities, looking at job titles to find members of staff likely to be
involved in, or interested in, digital asset management. In line with our guiding sense
of inclusiveness, we picked records managers, information managers, digital
librarians, repository managers, data curators, and web masters. Many responses at
this time came from records managers.
We soon received expressions of interest from:
       1. Heriot-Watt University
       2. King's College London (the CERCH project)
       3. University of Wales Swansea
       4. Wolfson College Oxford
       5. BL Endangered Archives Project
       6. University of Glasgow records management
       7. University of Dundee records management
The final list of AIDA participants was:
       1. Liverpool John Moores University Digital Repository
       2. Swansea University Archives
       3. Imperial College London
       4. Heriot-Watt University
       5. Liverpool University records management service
Case study onsite visits
We visited Imperial College in July 2008 and Liverpool's RM service in December
2008. The intention with onsite visits was to offer the institution an informal guided
walkthrough of some of the elements in the self-assessment exercise, how to interpret


7. http://aida.jiscinvolve.org/
8. Published 11th June 2008 at http://dablog.ulcc.ac.uk/2008/06/11/aida-call-for-volunteers/.





them, and how they could best be answered without expending too much effort. We
made some suggestions for where to look and who to speak to; explained and unpacked
the assessment tool in terms of what it might mean to a University; and suggested some
internal documentary sources that could be referred to or accessed.
Scorecard implementation
The weighted scorecard was built using the in-house skills of a database specialist at
ULCC, who has experience of processing statistical information. We started with a
guide called 'Constructing a Weighted Matrix'. 9 This matrix was originally intended
to help evaluate IT products (such as a database and its vendor), but we found ways of
adapting it to our purposes. The AIDA scores simply match the five stages - assigning
one point for each 'Acknowledge' answer, up to five points for each 'Externalise'
answer. The three legs are equally weighted at five points each, thereby expressing
the stability of the three legs model. However, not all elements score the same. Each
element is weighted according to our semi-subjective criteria. Having an institutional
mission statement is considered more important than a policy review, for example, so
the first element is weighted with five points while the second is weighted with two.
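
As a purely illustrative sketch of this scoring logic (not the actual ULCC spreadsheet),
the calculation might look like the following; the element names, weights and stage
answers are hypothetical, and the normalisation back to a five-point range is our
assumption about how the three legs are kept equally balanced.

    def weighted_leg_score(responses, weights):
        """Score one leg: each element's stage answer (1-5 points) is scaled
        by the element's weight, then the leg is normalised back to a 0-5
        range so that the three legs stay equally balanced."""
        raw = sum(responses[e] * weights[e] for e in responses)
        maximum = sum(5 * weights[e] for e in responses)
        return 5 * raw / maximum

    # Hypothetical Organisation-leg answers for one assessment:
    org_weights = {"Mission statement": 5, "Policy review": 2}
    org_responses = {"Mission statement": 3, "Policy review": 1}  # stage points 1-5
    print(round(weighted_leg_score(org_responses, org_weights), 2))  # -> 2.43
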
Toolkit Mark II
Following the case studies, we were able to update the Toolkit with two significant
improvements:
        The addition of quotes from case studies to better illustrate certain stages
        The addition of a second departmental tier
The quotes from respondents were modified so as to make it impossible to identify the
parent institution in question. This is to protect the confidentiality of respondents, but
also because we want people using the tool to recognise situations, not institutions.
The second change, the addition of an assessment tier directed at individual
departments rather than the entire institution, improved the usefulness of the toolkit
immensely. This move had already been anticipated in the Project Plan:
"Although the project’s primary focus will be on use of the tool across an institution,
we recognize that digital preservation and asset management activities are often better
addressed in a narrower context, and the tool will be designed in a way that will allow
its use within smaller groups, such as a research group, department or faculty."
It was, however, specific comments from respondents that confirmed the need for this
re-design of the toolkit and resulted in the two-tier version. One such comment was:
        …for some questions, we had problems in deciding whether we were
        responding on behalf of an area or the digital repository or on behalf of
        the University. Where there has been concern I have noted that we have
        reached a certain stage for the digital repository and in fact the
        University might be at a stage below.
Another respondent made it clear, in some detail, why we needed the flexibility of the
two-response method:



9. Designed by Craig Borysowich, retrieved from http://it.toolbox.com/blogs/enterprise-solutions/constructing-a-weighted-matrix-13125 on 7th July 2008.





      I've looked back again to the scores where we have fudged slightly by
      assessing ourselves as being at two stages simultaneously. Here is a little
      more background on the double scored elements.
      For Asset continuity, one of our respondents considered that they were
      between stage 2 and 3. At a corporate level there has been a lot of effort
      in risk management and business continuity at a strategic level which is
      now being drilled down operationally. Some areas have operational
      business continuity plans for their vital digital assets in place (the ones in
      current use that are needed to keep the business running) but we are a
      long way from applying a coherent and consistent strategy to preserve
      those digital assets that are worth keeping for 10, 20 and 100 years.
      For appropriate technologies, our response reflects local variations on
      the global picture. Considering this at an institutional level colleagues
      and I put us at stage 2; colleagues in one of our academic Schools
      assessed the School as being at stage 3.
Creation of the Departmental tier involved simply adding an additional row of
Indicators and Exemplars to each element of the toolkit, and adapting some of the
wording. The second tier was assumed to apply to individual asset collections, or an
individual department.
The questionnaire form was updated accordingly.
Self-help guide implementation
The self-help guide is a list of actions and available tools, designed to help an
institution move forward along the five stages.
This document evolved in three distinct phases:
Phase one: the design of a basic template with four main "clusters" of suggested
actions. We decided there would be no point in advising an institution how to move
forward to Stage One (from Stage Zero?), as to do so is equivalent to acknowledging
there's a problem. By definition, if you are using the toolkit, you have acknowledged
the problem, and are therefore at stage 1.
We revisited the wording of the Cornell stages, and looked again at the toolkit thus far.
We looked for verbs and actions everywhere - the assessment toolkit describes where
you are, whereas the self-help guide has to make suggestions for what to do.
Phase two: following the decision to redesign the Toolkit as a two-tier system, we re-
designed the Self-Help guide to match. We added a second column of Departmental
actions; some of these repeat the institutional ones, and some are not
applicable.
Phase three: we added a list of available tools - and tried to position them at
appropriate places across the levels. These tools were taken from DPC, JISC, DPE
and other places. A "tool" in this context can mean many things: a piece of software, a
standard, written guidance, training, a mailing list, a forum, a website, or a
publication.







The tools were taken by selecting resources from the following published lists and
inventories 10:
            1. The Digital Curation Centre (DCC) list of tools and standards
            http://www.dcc.ac.uk/tools/
            Especially the catalogue of digital curation tools
            http://www.dcc.ac.uk/tools/digital-curation-tools/

            2. The DPC Handbook, originally compiled by Beagrie and Jones, and now
            maintained online by the Digital Preservation Coalition:
            http://www.dpconline.org/graphics/handbook/

            3. Digital Preservation Europe's Registry of online resources:
            http://www.digitalpreservationeurope.eu/registries/resources/
            And the other DPE registries:
            http://www.digitalpreservationeurope.eu/registries/

            4. List of available JISC services:
            http://www.jisc.ac.uk/home/whatwedo/services.aspx

            5: JISC projects in the DP and Records Management programme
            http://www.jisc.ac.uk/whatwedo/programmes/preservation.aspx

            6. A published description of the JISC DP programme, 'Supporting Digital
            Preservation and Asset Management in Institutions' by Leona Carpenter
            http://www.ariadne.ac.uk/issue43/carpenter/

This exercise came from the notion that even if AIDA is not offering anything new, it
can add value to the community by making some sense of the multitude of tools that
are currently available in the world of digital asset management and digital
preservation. The task turned out to be something of a jigsaw puzzle (and almost as
subjective as the toolkit itself), but some of the tools clearly had "obvious" homes,
such as those format identification tools assigned to the "Obsolescence" element in
the Technology leg. The exercise also showed that tools can be used for many
purposes, applying at different elements and even at different stages, depending on the
problems they had to help with.
The final version of the guide has an Appendix of all tools cited in the main body of
the guide. The Appendix includes an indicator suggesting the minimum stage at which
an institution must be positioned before the tool starts to become useful to it.
Since the spreadsheet is sortable by Legs and Stages, it would be possible to perform
a simple filter action to determine all tools which apply at Stage 2 of the Organisation
leg, for example.
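
By way of illustration, that kind of filter action could be reproduced with a few lines
of code over a CSV export of the appendix; the file name and column headings below
are assumptions for the sketch, not the actual spreadsheet layout.

    import csv

    def tools_for(path, leg, stage):
        """Return the tools listed for a given leg that become useful once an
        institution has reached at least the given stage."""
        with open(path, newline="", encoding="utf-8") as f:
            return [
                row["Tool"]
                for row in csv.DictReader(f)
                if row["Leg"] == leg and int(row["Minimum stage"]) <= stage
            ]

    # e.g. all tools applying by Stage 2 of the Organisation leg
    print(tools_for("aida_tools_appendix.csv", "Organisation", 2))
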
Problems and issues
Identifying and recruiting suitable case studies proved to be one of the more time-
consuming parts of AIDA. There was a lack of clarity as to who we should approach
in the first place, since often there was no clearly dedicated asset manager role. There
was also a timing issue; unsurprisingly, few institutions had spare time or staff

10. All the URLs in the list below were consulted on 6th March 2009.





available to devote to the project, and a common question before they even agreed to
participate was 'how much time will this take me, how much is it going to cost, and
when do you want the results?'
The next issue was, ironically, the Cornell five stages model. The three legs model
stood up well (although one of the respondents questioned its value; see Appendix 3),
but it became increasingly awkward to map the five stages against all of the elements
we had devised. This is particularly so in the case of technological infrastructure,
where quite often the institution either has the capability (software,
hardware, expertise) or it doesn't; it is not always easy or clear-cut how to express
degrees of success with, for example, a procurement rollout. The five-stage model has many
strengths, but it could equally be seen as a somewhat stilted way of looking at asset
management.
The "available tools" exercise for the self-help guide was completed quite late in the
life of the project, and should not be seen as anything definitive; but for some
institutions, it may help to start narrowing the field of choice when it comes to
selecting the right tool for the job. This is an area which we see as ripe for further
work and for community input – as well as input from tool developers themselves.








Outputs and Results
The AIDA self-assessment toolkit has been the main deliverable and output of this
project. It exists as a 59pp Word Document. In its final incarnation, it has the
following elements:
      An introduction explaining how to use the tool
      A definition of what digital assets are
      A definition of digital asset management
      A list of FAQs to stave off certain reservations that users may have about
       undertaking the process
      A three-legged Assessment process, subdivided into auditable elements
       (which are numbered for ease of reference)
      Explanatory notes for each auditable element
      A description of the level of implementation for each element against the five
       stages
      Detailed indicators and exemplars for the level of implementation, which are
       expressed in one band which covers the entire institution, and a second band
       for an individual department or asset collection
      Real-life quotes from respondents embedded alongside the indicators, to give
       anecdotal examples and thus increase the chances of "instant recognition" by
       respondents
      A short list of possible supporting sources for each auditable element, i.e.
       documentary and record sources inside the institution which could be used as
       prompts for providing an answer
      An appendix describing the Cornell five stages and how they apply to digital
       assets
As a separate document, we built the Assessment Scorecard (also as a Word
Document), whose opening page gathers important profile information from the
participating institution. The profile data includes contact information, questions
about the type of institution, the scope of the assets in the survey, and the job roles of
those contributing to the assessment.
We feel that this incarnation of the toolkit works well, but could benefit from further
development. The AIDA project manager felt confident in devising the Organisation leg,
but lacked the skillset to devise the Technology leg (for which help was recruited
from an in-house IT infrastructure manager). One of the toolkit's strengths is its
structure, which was adapted from the TrustDR models. The two-tier approach hasn't
been tested yet, but this solution, which arose during the life of the project, directly
addresses one of the key reservations raised by more than one respondent in the
case studies.
The toolkit worked satisfactorily with most of the case participants, who were able to
recognise themselves in a large number of the auditable elements. (One unusual
exception to this was Organisation element 09 on contractual agreements; maybe
institutions don't use them that much.)





The weighted scorecard's evolution is an example of something that started out
complicated and became very simple through a process of refinement, as it
became clearer what was needed. This is not the usual path for such tools, which have
a bad habit of acquiring complexity during the life of a project, but it was a welcome
development. The weighting was and is somewhat subjective. The provisional scores
that were created for each institution (and sent to them in February 2009) were
supplemented with a simple graphic chart in Excel, so that each respondent could see
their overall score for each leg against the control score.
The toolkit was sent out with a scorecard in Word using tick boxes; we recoded the
results directly into our weighted spreadsheet. Since the completion of that exercise,
we have devised a crude but effective method of (a) doing this online and (b)
integrating the responses directly into the weighted scorecard. This automated version
has been built using Google Docs; a prototype can be seen at
http://aida.jiscinvolve.org/online-toolkit/. There are probably more sophisticated
methods of delivering the questionnaires, and clearly it could be more joined-up, but
this is a start. This feature is something ULCC would like to develop into
something more sophisticated. In particular, we feel there is potential value in
allowing institutions to compare their individual scores against aggregate scores for
similar institutions. This is an approach that is used with some success in exercises
such as the National Students Survey.
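
As a sketch of how such a comparison might work, assuming anonymised per-leg
scores had been collected from a number of institutions (the data, grouping key and
figures below are invented for illustration):

    from statistics import mean

    # Hypothetical anonymised results: (institution type, leg, score out of 5)
    results = [
        ("research-intensive", "Organisation", 3.2),
        ("research-intensive", "Organisation", 2.8),
        ("teaching-led",       "Organisation", 2.1),
        ("research-intensive", "Technology",   3.5),
    ]

    def peer_average(results, institution_type, leg):
        """Mean score for one leg across all institutions of the same type."""
        scores = [s for t, l, s in results if t == institution_type and l == leg]
        return mean(scores) if scores else None

    # An individual Organisation score could then be shown against the peer average:
    print(peer_average(results, "research-intensive", "Organisation"))  # -> 3.0
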
The self-help guide is not quite completed, but it has potential for developing into an
extremely useful tool. It has not yet been circulated to case respondents, to test its
usefulness. The addition of the available tools has been a most useful exercise. The
self-help guide suffered during much of its development from the same problem as
the toolkit (see above), in that the AIDA project manager felt confident in devising the
Organisation leg, but lacked the skillset to fully develop the Technology leg. Some
elements are currently left blank, with no suggestions yet made for self-help actions.
AIDA received feedback and responses from institutions, both on the general
process of completing the assessment, and on specific elements in the toolkit. The
specific comments on each stage and each leg have been fed back, in anonymised
form, into the structure of the toolkit. The general feedback on the process is attached
to this Report as Appendix 3.








Outcomes
The summary below sets out AIDA's achievements against the original aims and
objectives of the project.

Aim: The central aim was the production of an assessment toolkit, to enable
institutions to perform self-assessment of their readiness for digital asset management.
Achievement: Done.

Aim: The toolkit was to be tested internally at ULCC, in the partner institutions, and
externally.
Achievement: Done.

Aim: Brief fact-finding stage, to gain some familiarity with the sorts of digital assets
being created in institutions and to gain knowledge of their requirements.
Achievement: No fact-finding stage was carried out.

Aim: Recruitment of case study test sites to develop and assess the toolkit.
Achievement: Done; five case study test sites in all.

Aim: Produce case studies from these testing operations.
Achievement: We did not receive sufficient feedback or content from the participating
institutions to produce meaningful written case studies, but see Appendix 3.

Aim: Self-help guide with suggestions and recommendations about moving forward,
and how to change.
Achievement: Done; this was expanded to include a list of available tools.

Aim: A decision-making tool which provides guidance on performing digital asset
management. This was an ambitious part of the initial plan (section 3.3); the tool was
intended to "allow for selection and appraisal decisions, accommodate workflow
schedules, and incorporate an understanding of the current and future management
and use of the assets."
Achievement: AIDA did not develop a decision-making tool.

Aim: Collaboration with LIFE, Planets and Caspar. Cost modelling was supposed to
be one component also; the idea was that the cost of managing assets could be
measured over time, and future costs could be predicted.
Achievement: Formal collaborations were not possible, but we have had strong links
with the LIFE project and its successors.

Aim: Dissemination and communication through a website.
Achievement: Done. We'd also like to automate the assessment and scoring process,
allowing it to be completed online and remotely, and to aggregate statistics.

Aim: Workshop to present the toolkit.
Achievement: To be done.


AIDA's achievement has not been to provide anything especially innovative in the
field of assessment approaches, nor in the field of digital asset management, but it has
hopefully made the concepts more approachable and understandable to a wider range
of professionals. The project will therefore be of benefit to information professionals
who have heretofore felt disenfranchised from the process of digital asset
management. The core intention of this project was for AIDA's outputs to encourage
institutions to share good asset-management practice across disciplinary boundaries.
The project has made a start in sorting out the multitude of available tools and
standards that might assist asset management. Some professionals in the DP world
can feel overwhelmed by the choice; we have put some of them into a structured
framework, indicating when and how they might apply. We believe that further
development of this work could be a particularly useful outcome of AIDA,
independent of the long-term success or otherwise of other aspects of the toolkit. If
nothing else, such assessments can encourage tool developers to think harder about
the potential audience for their tools and the pre-requisites for their use.
Our methodology may have benefited from more fact-finding, practical case work and
real-world examples. The results have involved too much copying from known
sources, with not enough actual study or doing, and they are based too much on
anecdotal evidence rather than on experience.








Conclusions
Providing a toolkit to allow institutions, or parts of institutions, to assess their
strengths and weaknesses in digital asset management against an external, defined
scale has benefits in helping them understand what their next actions should be. Using
anecdotal quotes to define the levels of maturity as well as abstract descriptions
improves the usability of such a self-assessment tool, as it makes it easier for those
using it to recognise themselves and their institutions by analogy with others.
Asset management is unlikely to be the responsibility of one person or one group.
Indeed, the scale used by AIDA implies that institution-wide management (or even
cross-institutional management) only emerges at the very highest stages of the
maturity model. Experience with this model as it applies to digital preservation
suggests that even the most experienced institutions have yet to reach this stage in all
respects. But this fragmentation can mean it is difficult to use the tool to fully reflect
all aspects of asset management in an institution, since the knowledge to use it is
likely to be fragmented. Nonetheless, the tool can still be used effectively. One way to
draw out information from another part of an institution is to publish an assessment
based on partial knowledge. Those who disagree with its findings will rapidly emerge
to contradict them!
In retrospect, we realise that we should have aimed to have a draft of the self-help
guide available earlier in the project so that we could have had external feedback on
it. We should also have assigned relevant IT infrastructure management expertise to
the project at an earlier stage.








Implications
There remains some web development work to produce an interactive version of the
toolkit, which ULCC need to schedule in the coming months to maintain momentum.
Digital asset management is currently a topic of great importance amongst many
institutions and the online tool will be useful as a promotional device.
The interactive version could, over time, accumulate anonymised scores and start to
build aggregated figures, giving some sort of picture of the national situation with
regard to asset management. The original NPO Preservation Assessment Survey did
just this for archives, libraries and museums, and it was part of DAAT's remit to
emulate this aspect of their strategy.
The self-assessment tool, the self-help guide, and the matching of asset management
tools to institutional or project maturity levels all have independent utility. Even if one
component lacks long-term value, the others may still possess it.
The AIDA self-help guide can be seen as a beginning rather than an end, the first step
in a process that should develop into a detailed assessment of tools. As the DCC
website states, "It is hardly surprising that many of you are experiencing tools and
standards fatigue: a sense of confusion about what tools and standards exist, how
they apply, how their costs and benefits stack up, and how they relate. Most studies of
what tools and standards exist often leave you unsure of how to proceed. For their
part, many tools and standards are poorly linked, inconsistently used, and not always
clear about their intended application."
The intention is to develop the AIDA self-help guide from a simple spreadsheet table
into a database, and a wiki version which would enable comment and feedback from
the DP community, especially the authors and creators of the tools. Such an initiative
would tie AIDA in with other inventories and testbeds of tools that are currently being
carried out, for example by the PLANETS project.








Recommendations
1 Consider use of the assessment tool to gain a greater understanding of digital
asset management strengths and weaknesses in your institution or project.
2 If use of the tool at institution level is too problematic, consider it for a project
or service such as an institutional repository.
3 Use the self-assessment tool’s results and the self-help guide to help improve
practice in your institution.
4 Feed back your observations on the self-assessment toolkit to the AIDA project
team at ULCC; they may be helpful for future users.
5 Tool developers should consider using the model’s maturity levels to provide a
description of the likely audience for their tools.








References
Borysowich, Craig, 'Constructing a Weighted Matrix' (2006)
From http://it.toolbox.com/blogs/enterprise-solutions/constructing-a-weighted-matrix-13125
The Center for Research Libraries (CRL) and Online Computer Library Center, Inc. (OCLC),
Trustworthy Repositories Audit and Certification (TRAC): Criteria and Checklist
http://www.crl.edu/content.asp?l1=13&l2=58&l3=162&l4=91
Consultative Committee for Space Data Systems (CCSDS)
OAIS: Reference model for an Open Archival Information System (2002)
http://public.ccsds.org/publications/archive/650x0b1.pdf
Digital Asset Assessment Tool (DAAT) Final Report, 13 October 2006
http://www.jisc.ac.uk/publications/publications/daatfinalreport.aspx
Digital Preservation Training Programme (DPTP)
http://www.ulcc.ac.uk/dptp/about-dptp.html
Jones, Ross and Ruusalepp, Data Audit Framework Methodology, version 1.7
HATII, University of Glasgow (2009)
Kenney, Anne and McGovern, Nancy, the Cornell models (not formally published)
Kenney, Anne. The Cornell University Survey of Institutional Readiness.
A Managed Learning Environment Integration Matrix.
From the Scottish Funding Council E-Learning Implementation Guide. (Tool 1A)
From http://trustdr.ulster.ac.uk/outputs.php
National Preservation Office, The Preservation Assessment Survey
http://www.bl.uk/npo/paslib.html
Network of Expertise in long-term STORage (NESTOR)
http://www.langzeitarchivierung.de/index.php?newlang=eng
Proven, Jackie, Casey, John and Dripps, David.
Analysis and audit tool for rights management in learning object repositories
University of Ulster and UHI Millennium Institute. (Tool 2B)
From http://trustdr.ulster.ac.uk/outputs.php
Ross, Seamus and McHugh, Andrew, 'The Role of Evidence in Establishing Trust in Repositories'
D-Lib Magazine, July-August 2006
http://www.dlib.org/dlib/july06/ross/07ross.html
Vernon, R. David and Rieger, Oya Y., 'Digital Asset Management: An Introduction to Key Issues'
Cornell University, Office of Information Technologies, 2002
http://www2.cit.cornell.edu/oit/Arch-Init/DigAssetMgmt.pdf








Appendixes
APPENDIX 1: Consolidation List - a stage in production of the toolkit
APPENDIX 2: Call for Volunteers message
APPENDIX 3: Compilation of feedback from case studies


Deliverables are:
      Toolkit
      Questionnaire
      Online questionnaire
      Blank weighted scorecard
      Self-help guide
      This final report
      The website








APPENDIX 1
Early stage in development of the assessment toolkit: working with four separate
sources, we carried out a comparison to find areas of commonality among the
statements, questions and elements found in these documents. The comparison is set
out below, grouped by element under Organisation, Technology and Finance/Resource;
for each element, the corresponding criteria from the TDR checklist, the Cornell
readiness survey, the RLG-OCLC summary and NESTOR are listed where present.
ORGANISATION

Mission statement
      TDR checklist: Reflects commitment to DP
      Cornell readiness survey: Supports long-term commitment to DP
      RLG-OCLC summary: Evidence of fundamental commitment to long-term retention
      NESTOR: The DR has defined its goals.

Contingency plan
      TDR checklist: Formal succession plan
      RLG-OCLC summary: Risk management, contingency and succession planning
      NESTOR: The DR engages in long-term planning.

Policies and procedures
      Cornell readiness survey: Policies in place for long-term access, selection,
      acquisition, quality creation, transfer, and preservation strategy
      RLG-OCLC summary: Establish effective management policies; adopt appropriate
      preservation strategies
      NESTOR: The DR has developed criteria for the selection of its digital objects.

Policy review
      TDR checklist: Mechanism for review, update and development
      RLG-OCLC summary: Review and maintain policies and procedures

Policy implementation and authority
      Cornell readiness survey: Policies should be vetted by senior management and
      implemented
      RLG-OCLC summary: Enact all policies and procedures for specified functions

Monitoring and feedback
      TDR checklist: Continued operation assured
      RLG-OCLC summary: Establish monitoring mechanisms to ensure continued operation

Review and assessment
      TDR checklist: Policies should be reviewed and maintained
      RLG-OCLC summary: External experts for validation of processes

History / audit trail
      TDR checklist: Documented history of changes to operations, etc.
      RLG-OCLC summary: Document all practices; record and justify preservation
      strategies

Transparency
      TDR checklist: Accountability in all actions
      RLG-OCLC summary: Commit to transparency in all actions

Deposit agreements
      TDR checklist: Appropriate contracts / agreements are present and maintained
      Cornell readiness survey: Comprehensive deposit guidelines and written transfer
      requirements
      RLG-OCLC summary: Define written agreements with depositors
      NESTOR: Legal contracts exist between producers and the DR.

Digital rights
      TDR checklist: Specified appropriate preservation rights as needed
      RLG-OCLC summary: Have appropriate legal status
      NESTOR: In carrying out its archiving tasks, the DR acts on the basis of legal
      rulings.

Copyright and IPR
      TDR checklist: Tracked and managed as required by contract
      NESTOR: Legal and contractual rules are observed

Liability
      TDR checklist: Policy for liability if needed



TECHNOLOGY

Technological infrastructure
      TDR checklist: Repository OS is well-supported
      Cornell readiness survey: Technological infrastructure is adequate to sustain DP
      RLG-OCLC summary: Appropriate infrastructure for acquisition, storage and access
      NESTOR: The IT infrastructure is adequate.

Backup
      TDR checklist: Backup function for all services and data
      Cornell readiness survey: Storage program includes backup and offsite storage
      for backups
      RLG-OCLC summary: Policy for backups

Location of copies
      TDR checklist: Number and location of all copies is known
      RLG-OCLC summary: Policy for copying

Synchronisation
      TDR checklist: Multiple copies are synched
      RLG-OCLC summary: Policy for authentication

Data corruption
      TDR checklist: Mechanisms for corruption and loss
      Cornell readiness survey: Storage program includes media testing program
      RLG-OCLC summary: Processes for detection, avoiding and repairing loss

Migration process
      TDR checklist: Storage media migration process defined
      Cornell readiness survey: Storage program includes media refreshing and
      migration program

Changes to critical processes
      TDR checklist: Documented change management process
      RLG-OCLC summary: Process to notify about changes and resulting actions

Appropriate technologies
      TDR checklist: Hardware and software
      RLG-OCLC summary: Appropriate infrastructure for acquisition, storage and access

Security - environment analysis
      TDR checklist: Systematic analysis of the information environment
      Cornell readiness survey: Storage program includes access-controlled area for
      storage media
      RLG-OCLC summary: Assure security of systems
      NESTOR: The IT infrastructure implements the security demands of the IT
      security system.

Mechanisms for security
      TDR checklist: Processes to address defined security needs
      Cornell readiness survey: Security and other mechanisms in place
      RLG-OCLC summary: Policy for firewall

Disaster recovery
      TDR checklist: Written disaster plan; plan is tested; process for disaster
      recovery
      Cornell readiness survey: Storage program includes disaster recovery plan
      RLG-OCLC summary: Policy for disaster preparedness, response and recovery

Technology management
      Cornell readiness survey: Dedicated funds for technology development,
      replacement and upgrades
      RLG-OCLC summary: Establish policy for replacement, enhancement and funding

External audits
      RLG-OCLC summary: External audits on system components and performance

Digital collections
      Cornell readiness survey: List of digital objects types (e.g. images, email,
      word processing, websites)

File storage used
      Cornell readiness survey: List of file storage types (e.g. online, tape, CD)

Obsolescence
      Cornell readiness survey: Digital materials that cannot be mounted, read or
      accessed; actions taken for obsolete file formats, storage media, storage
      drives, hardware and software

Depository
      Cornell readiness survey: Establish digital depository arrangements for managing
      collection - e.g. in-house, outsourced, third party, consortium



FINANCE / RESOURCE

Business planning process
      TDR checklist: Process in place to support sustainability
      RLG-OCLC summary: Good / transparent business practices

Review of business plans
      TDR checklist: Process to review and adjust business plan
      RLG-OCLC summary: Maintain business plan

Transparency / auditability
      TDR checklist: Practices are transparent and compliant
      RLG-OCLC summary: Auditable business plan

Risk analysis
      TDR checklist: Ongoing commitment to risk analysis
      RLG-OCLC summary: Undertake risk management, contingency and succession planning

Funding
      TDR checklist: Funding gap is recognised
      Cornell readiness survey: Sustainable funding dedicated for long-term
      maintenance; dedicated funds for technology development, replacement and
      upgrades
      RLG-OCLC summary: Maintain budget and reserves
      NESTOR: Adequate financing of the DR is secured.

Staff skills
      TDR checklist: Repository staff have appropriate skills and expertise
      Cornell readiness survey: Dedicated staff for DP; organisational expertise;
      technical expertise
      RLG-OCLC summary: Have appropriate staff
      NESTOR: Appropriately qualified staff are available.

Staff numbers
      TDR checklist: Appropriate numbers to support repository functions
      Cornell readiness survey: How many dedicated staff / list their titles
      NESTOR: Sufficient numbers of staff are available.

Staff development
      TDR checklist: Commitment to professional development / currency of skills and
      expertise
      Cornell readiness survey: Adequate support for staff training in DP
      RLG-OCLC summary: Have appropriate professional development and training policy

External funding
      Cornell readiness survey: Use outside sources of expertise for DP (e.g.
      consultants and contractors)
      RLG-OCLC summary: Actively seek potential funding sources








APPENDIX 2
Copy of message sent out in June 2008 as part of the "recruitment drive" to enlist
participants as case studies.


AIDA: call for volunteers
How safe are your digital assets? Do you think you know all about your digital assets? Would you like
to understand more about how to improve digital asset management in your organisation?
ULCC are currently leading a project (sponsored by JISC) called Assessing Institutional Digital Assets,
or AIDA. We’re looking for institutions in the HFE sector in the United Kingdom who would like to
help us, by participating as a case study for this project.
The idea is that you would complete a guided self-assessment task which we hope will make things
clearer in relation to you and your digital assets. We plan to do this around June-July 2008.
For this, we have drafted a self-assessment toolkit which would help determine your institution’s
current capacity for digital asset management. It will help you assess your institution’s ability /
readiness for digital asset management. Based on that assessment of readiness and maturity, later
project outputs will provide recommendations on appropriate steps to take to improve digital asset
management for you. (We’re approaching different institutions who are likely to be at different stages
of maturity). The toolkit can be found at http://aida.jiscinvolve.org/toolkit/.
The process and the outputs of this project may be of some benefit to you. We think that the tools,
guidance and case studies will help institutions understand how to take small steps forward to improve
institutional maturity in regard to digital asset management and preservation concerns.
We’re looking to work with the following information experts: records managers, librarians, digital
librarians, data curators, repository managers, information managers, digital asset managers, web
masters, archivists, and others. Our guess is that there is no single person in the institution who can do
the entire self-assessment, so it may turn into a team effort. There is also the possibility of on-site
support or remote support from ULCC. We’re able to provide some financial support, via our JISC
funding, to a small number of case-study sites.
If you’re interested in participating, we are looking at starting around June or July 2008, depending on
availability of yourself and your staff. We expect the work to take an absolute maximum of eight days,
but our hope and expectation is that it will be less for many institutions. At this stage, we are looking
for participation from UK Higher Education institutions only, although comments from others are
welcome.
Further information is available at our project website.
Contacts: Ed Pinsent (e.pinsent@ulcc.ac.uk) / Patricia Sleeman (p.sleeman@ulcc.ac.uk)
                                                                                              11 June 2008








APPENDIX 3
A compendium of general feedback on the AIDA process.
The AIDA process was…
     Not too time consuming.
Was it useful?
     Helpful to have the visit, would not have understood otherwise (could
     probably have done the whole thing then as a kind of interview. Possibly
     you might have got more take up if you asked for three hours and then a
     bit of follow up rather than having to fill in independently?).
     We will find the self-assessment of our capacity, state of readiness, and
     overall capability for digital asset management very useful and look
     forward to receiving your feedback and guidance on developing an
     institution-wide approach to asset management.
     I found it useful just because it collected all this information in the one
     place. However, it didn’t tell me anything I didn’t already know. [Our]
     risk review (carried out last year) covered pretty much the same ground.
     The toolkit could potentially be used as part of a push for more resources,
     but [our] Programme has already mapped out its priorities for the coming
     years.
     [We] would have preferred to have the descriptive information in the
     same document as the form, rather than have to flip between two separate
     documents. [This is] no reflection on the content and intent of the process,
     which we found useful.
Were the terms clear?
     The terminology is sometimes confusing but generally the guidance notes
     explained it well.
     Although there is a description of what is intended by Digital Asset, this is
     difficult to apply when doing the exercise. Although not every e-mail for
     example is a digital asset, the overall information contained within the e-
     mail systems would be critical if lost, and so could be considered an asset
     from a vital records perspective. If anyone does not have the information
     to do their jobs, this reduces the output of the organisation. It would
     almost be better to ask institutions to list the digital assets that they
     believe they have, to ensure understanding of the terminology.
     It would have been very helpful to have had more definitions of the terms
     used in the exercise.
     Some of the questions referred directly to Digital Assets, and yet others
     did not. This is confusing particularly in light of our comments re
     scoping.
Was the scope of the toolkit applicable?
     It is difficult to fill in for the University as a whole as there is so much
     variation.





     We were expecting more specific questions on things like EDMS and
     fileplans, research data etc.
     The scoping in particular we found problematic - it is hard to build a
     representative picture based on a narrow area of focus.
     Although the starting point for the toolkit is to consider a scoped down
     section of the organisation, the questions make it very difficult to apply
     this thoroughly and still provide a representative view of the status of the
     organisation. In a University, it is generally the case that policies are
     produced centrally and then implemented in each department, for
     example. However the intention is for the scope of responses to be
     narrower than this. In our exercise, although our starting point was the
     College Headquarters, we had to unofficially widen our scope to include
     our ICT, Library, Staff and Student teams, Research, Communications,
     and the balance of the departments themselves, to try and accurately
     reflect the big picture. The reverse of this is that where one team may be
      excelling, others may have other priorities.
     The toolkit seems to try to consider digital assets as structured and
     organised separately away from other outputs and work. This does not
     reflect the way in which this organisation at least has adapted and grown
     over the years. The majority of systems are parallel paper and digital,
     and are likely to remain so for some time. The policies, procedures and
     practices in place here reflect this, however the toolkit does not give the
     opportunity to express this easily in responses.
Was the structure helpful?
     There was some repetition across the three areas, in fact does it need to
     be split into organisation/technology/resources? We did not find this
     helpful.
     We found the questionnaire to be repetitive in many ways; while
     understanding the need to get accurate data, some areas were covered
     many times.
     We thought it might work as a questionnaire with more specific questions
     lifted from the indicators and that this might make comparisons between
     institutions easier.
     In some questions, the progression through stages 1 to 5 seemed to be out
     of sequence as we would approach it, or the jumps from one stage to the
      next seemed very large.
Could you recognise yourself?
     As mentioned on the phone, for some questions, we had problems in
     deciding whether we were responding on behalf of an area or the digital
     repository or on behalf of the University. Where there has been concern I
     have noted that we have reached a certain stage for the digital repository
     and in fact the University might be at a stage below.
     I've looked back again to the scores where we have fudged slightly by
     assessing ourselves as being at two stages simultaneously.






      We could recognise ourselves although it needed at least two [people] to
      come to a decision and even then we felt we did not always have enough
      information.
      We found it quite a subjective process.
      ...it is extremely difficult to consider digital assets (and perhaps we still do
      not fully understand the definition of these) in isolation; and we have been
      trying to respond to the specific points as written out in each section of
      the toolkit and where some of them mention digital assets and some do not
      and some mention policies relating to digital assets and some do not, it is
      difficult to be consistent.





				