A FRAMEWORK FOR COMPUTER-ASSISTED ASSESSMENT
THROUGHOUT THE UNIVERSITY OF GLASGOW
Principal Investigator: R Thomson, Mechanical Engineering
Co-investigators: W Stewart, Civil Engineering, E Yao, Physics and Astronomy
The aim is to develop a Moodle-based framework for a bank of objective test items that can be both
populated and plundered by specialists in all academic disciplines. This will improve feedback to
students and their teachers, smooth the induction of new students at all levels and improve retention.
The project will emphasise the need for item analysis as part of the quality-assurance process that is
needed to validate test items prior to their use for 'live' assessment. Outcomes will include:
- an on-line Users’ Manual, available to all University staff to assist in the design and
implementation of objective tests;
- a small bank of model questions, initially focussed on mechanics, a topic that is taught in
several Faculties and at all levels. These questions will provide templates, illustrating
Moodle features that will transfer across disciplines;
- a list of desirable features that Moodle currently lacks and, possibly, the coding to implement them.
The work will build on past experience of objective testing both in the University and elsewhere,
including that of the principal investigator, a trained teacher.
The project will begin with an expert-led seminar, open to all University staff, followed by a phase in
which two undergraduates from each of the three participating departments will be engaged to
generate objective test items. During this phase, selected classes will be exposed to the individual test
items. This will allow the items to be evaluated quantitatively, by item analysis, and qualitatively, via
on-line means and face-to-face focus groups. Throughout the project, a part-time research assistant
(RA) will coordinate the activities of the students and participating academic staff.
In the second year of the project, test items that meet the appropriate quality standards will be
assembled into complete quizzes and used for formative assessment of selected classes. The project
will conclude with an evaluation of the quizzes and a look forward to their use for summative assessment.
Aims and outcomes
The aim is to develop a Moodle-based framework for a bank of objective test items that can be both
populated and plundered by specialists in all academic disciplines. The use of such items in Moodle
quizzes will provide immediate, response-dependent feedback to students as well as diagnostic
feedback to staff. This information will smooth the induction of new students at all levels including
taught postgraduate. It is also expected that it will improve retention by the positive reinforcement of
exam success and by increasing students’ overall engagement with their course.
The item bank will contain tried-and-tested questions suitable for pre-testing and for diagnostic,
formative and summative assessment. The key phrase here is tried and tested, and the study will
emphasise the use of item analysis as part of the quality-assurance process that is needed to validate
test items prior to their use for 'live' assessment.
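Item analysis of this kind conventionally rests on two statistics: the facility index (the proportion of candidates who answer the item correctly) and a discrimination index (how well the item separates strong candidates from weak ones, computed here as the point-biserial correlation). A minimal sketch, with invented response data:

```python
# Minimal item-analysis sketch. The response data below are invented;
# Moodle computes equivalent statistics in its quiz reports.

def facility(item_scores):
    """Facility index: proportion of candidates answering the item correctly."""
    return sum(item_scores) / len(item_scores)

def discrimination(item_scores, total_scores):
    """Point-biserial correlation between item score (0/1) and total test score."""
    n = len(item_scores)
    mi = sum(item_scores) / n
    mt = sum(total_scores) / n
    cov = sum((i - mi) * (t - mt) for i, t in zip(item_scores, total_scores)) / n
    sd_i = (sum((i - mi) ** 2 for i in item_scores) / n) ** 0.5
    sd_t = (sum((t - mt) ** 2 for t in total_scores) / n) ** 0.5
    return cov / (sd_i * sd_t)

item = [1, 1, 0, 1, 0, 1, 1, 0]          # one candidate per entry
totals = [18, 16, 9, 15, 8, 17, 14, 7]   # each candidate's whole-test score

print(round(facility(item), 3))                # 0.625
print(round(discrimination(item, totals), 3))  # 0.953
```

An item answered correctly by almost everyone (facility near 1) or by almost no one (facility near 0) tells the examiner little, and a low or negative discrimination flags an item that weak candidates answer as often as strong ones; either is grounds for revision before 'live' use.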
An outcome of particular use to objective test developers will be an on-line Users’ Manual. This will
outline the theory of objective testing, illustrate the different types of test item and demonstrate item
analysis as a quality-assurance procedure for test items. The manual will also give simple instructions
on how to implement objective tests in Moodle. The use of a wiki format will ensure that the manual
can be expanded and updated by any developer in any discipline.
While the objective is to establish a general framework that will be of use to all disciplines and levels,
the topic of mechanics is adopted as the focus for the current project. Mechanics is taught in
a number of Faculties, including Engineering, Physical Sciences, Information and Mathematical
Sciences, and Education. It also spans the range from Level 1 to Taught Postgraduate. Objective test
items for mechanics require embedded mathematics and graphics, and their design will challenge both
the investigators and the Moodle virtual learning environment (VLE).
The item bank will initially be populated with test items specific to the proposers’ own courses.
However, these will be designed to exercise various features of Moodle and so provide templates that
are transferable to other disciplines. Exercising Moodle in this way will also highlight desirable
features that it currently lacks. The open-source nature of Moodle gives the option of in-house
development of new features and subsequent release to the world-wide Moodle community.
While the project will see the use of objective tests for formative assessment, the timeline is
insufficient to allow the use of newly-created items for summative assessment. However, this is an
obvious corollary which offers significant and sustained potential benefits to staff. As a final
outcome, it is hoped that staff in other disciplines will participate in the expansion of the item bank.
Background
Objective tests have long been used in several academic departments. Computerised marking of these tests is
commonplace but the use of a VLE such as Moodle streamlines the whole objective testing process
and, in particular, admits immediate, response-dependent feedback to students and staff. The Moodle
'quiz' module has been used thus by several academic departments to create item banks relevant to
their own subjects. However, anecdotal evidence suggests that little or no item analysis, and little
analysis of the quality of the feedback, has been done. As a result, the use of the questions,
particularly for summative assessment, is not defensible.
Sizeable item banks have been built in several University departments and there is a recognised
requirement for a cataloguing system that assigns a unique tag to items on different topics and at
different levels. This is particularly important when authors are prepared to share items throughout
the University community; then the need is for a common tagging system and provision for the
possibility of multiple tags. A number of cataloguing systems, each with its strengths and
weaknesses, have already been developed for use in libraries, and it is expected that one of these
will form the basis for an item tagging system.
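As an illustration of what such a system might look like (all identifiers, tags and question stems below are invented), each item could carry a unique id plus one or more hierarchical discipline/topic/level tags, echoing library classification while allowing multiple tags per item:

```python
# Hypothetical tagging sketch: ids, tags and question stems are invented.
# Hierarchical tags of the form discipline/topic/level let one item appear
# under several subject headings at once.

bank = {
    "MECH-0001": {"stem": "A ball is released from rest ...",
                  "tags": ["physics/kinematics/level1",
                           "engineering/dynamics/level1"]},
    "MECH-0002": {"stem": "A simply supported beam carries ...",
                  "tags": ["engineering/statics/level2"]},
}

def find_items(bank, tag_prefix):
    """Ids of all items carrying at least one tag starting with tag_prefix."""
    return sorted(item_id for item_id, record in bank.items()
                  if any(tag.startswith(tag_prefix) for tag in record["tags"]))

print(find_items(bank, "engineering/"))  # ['MECH-0001', 'MECH-0002']
print(find_items(bank, "physics/"))      # ['MECH-0001']
```

The prefix search shows why hierarchical tags help: a query at the discipline level gathers every topic and level beneath it, while a deeper prefix narrows to a single topic.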
The literature of every academic discipline has its own special requirements, which might include
graphics, non-standard fonts and, possibly, animation. In engineering and the physical sciences, a
major requirement is for mathematical notation. Exploratory work by the principal investigator, a
former schoolteacher with experience of objective testing, has produced a small bank of mechanics-
oriented MCQs, cloze questions and simple true/false questions that demonstrate Moodle's ability to
support such special needs. However, none of these items has been subject to any quality assurance
process and so they cannot be rolled out for use in 'live' assessment.
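By way of illustration, a mechanics item of the kind described might be written in Moodle's embedded-answers ('Cloze') question format roughly as follows (the values and wording are invented; the {…:NUMERICAL:…} and {…:MULTICHOICE:…} fields are standard Cloze sub-question syntax):

```
A ball is released from rest and falls freely. Taking g = 9.8 m/s^2,
its speed after 2.0 s is {1:NUMERICAL:=19.6:0.1} m/s and it is moving
{1:MULTICHOICE:=downwards~upwards}.
```

The numerical sub-question accepts answers within the stated tolerance, and each answer in either field can carry its own response-dependent feedback.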
The theory of objective testing is well known to educationalists, while the content to be examined is
well known to subject specialists. The project requires input from both groups.
The aim of the project is not to provide a fully-populated item bank. Rather, it is to design a
framework for such a bank that will be of use throughout the University and that will encourage
subject specialists to populate it with items of their own, especially if these are transferable to other
disciplines. The analogy is that of a library, seen as a framework for the storage of knowledge in all
subjects. The analogy can be extended via the concept of expandable storage space and a cataloguing
system that can accommodate new fields of knowledge. A key feature of the item bank will be its
sustainability and expandability.
It will, of course, be necessary for the investigators to devise a number of model questions, test items
on which quality assurance tests have been performed. It is recognised that academic disciplines, and
even individual academics, have their own terminology and notation and so these model items must be
designed as templates that can be amended to suit different local requirements. This will ensure
transferability across disciplines and sustainability of the item bank over time.
A key objective is the engagement of students. Students know better than staff what they find
difficult on any course and two students from each of the participating departments will be employed
to ‘kick-start’ the project by generating test questions. This will be followed by a phase in which
selected classes will be exposed to these questions to allow quality assurance testing of each item.
The students initially engaged will be retained part-time during this phase to further populate the item
bank and generate complete quizzes. In the second year of the project, these quizzes will be used for
formative assessment by selected classes. During this phase, all of the students in each class will be
encouraged to devise and submit questions to the investigators. Such submissions will be evaluated
and, if suitable, added to the bank.
The evaluation of test items is a major task and will require a part-time research assistant (RA). The
RA will ensure conformity of the individual items with the agreed model and will advise on question
structure. Each investigator will be responsible for ensuring the accuracy of the responses within
each item, as required by their own courses, but L+T staff will advise on quality-assurance testing and
on the evaluation of student responses. A potential RA, willing to participate in the project, has been
identified. This individual has set up Moodle quizzes in the Arts Faculty but has a background in
education and computer science. She is well-placed to coordinate this project.
The response of student groups to tests will be monitored both quantitatively, via item analysis, and
qualitatively, using face-to-face focus groups and on-line means that might include a wiki, blog or
discussion forum. An L+T staff member, as an independent person, will run the focus groups.
Anonymised responses will be collated. Ethics Committee approval is not required but students will
be asked to sign a consent form. They may opt out of the study.
Potential Applicability and Transferability
It has already been noted that the aim is to provide a general framework for a bank of objective test
items that will be of use throughout the University. The degree to which this is achieved will, of
course, be presented at a future L+T seminar. However, the intention is to begin the project with an
expert-led seminar on the design and development of objective tests. The Higher Education Academy
(HEA) will be asked to supply a seminar leader. Subject to resources, this seminar will be opened to
GU staff and, possibly, staff from other institutions. This might encourage the participation of staff in
a range of disciplines.
A number of short research seminars will be given in the second year of the project. These will be
open to all staff and will not only allow dissemination of results but provide feedback to the
investigators on the conduct of the project and the needs of other staff. This, it is hoped, will increase participation.
The results of the project will also be disseminated via the appropriate HEA subject centres.
Any perceived deficiencies in Moodle, together with any enhancements developed in-house, will be
made available to the world-wide Moodle community.

Evaluation
Evaluation has two aspects in this project: evaluation of the individual test items and evaluation of the
results of implementing objective test quizzes on student performance.
Evaluation of the test items requires both quantitative item analysis, which is already implemented in
Moodle, and qualitative feedback from students. The latter can be obtained in a number of ways: a
Moodle wiki or blog and a discussion forum will be run continuously throughout the project to allow
students to comment on the progress of the project. These will be supplemented by monthly focus
groups that will be attended by all staff but run by an L+T staff member acting as an independent person.
To assess the impact of the quizzes on student performance, it would be scientifically preferable to
divide each class into a 'quizzed' group and a control group. However, this is ethically questionable,
since it would deny some students a potentially useful diagnostic tool and study aid. A simple
comparison of the pass rate for each class with that of the previous year will therefore be made. The
wiki/blog/forum will also be used to elicit students’ qualitative reaction to quizzes.
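If a slightly more formal reading of that year-on-year comparison is wanted, a two-proportion z statistic gives a quick check on whether a change in pass rate is larger than chance alone would explain. A sketch with invented class sizes and pass counts:

```python
import math

def pass_rate_z(passed_a, n_a, passed_b, n_b):
    """Two-proportion z statistic comparing the pass rates of two cohorts."""
    p_a, p_b = passed_a / n_a, passed_b / n_b
    pooled = (passed_a + passed_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Invented example: 88/100 pass this year against 78/100 last year.
z = pass_rate_z(88, 100, 78, 100)
print(round(z, 2))  # 1.88; |z| > 1.96 would indicate significance at the 5% level
```

Such a test cannot, of course, attribute any change to the quizzes alone, which is why the qualitative wiki/blog/forum evidence matters alongside it.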
In Year 1, the objectives are to establish the item-bank framework and populate it with individual
model questions that have been subject to quality-assurance testing on selected classes. Year 2 will
see the expansion of the item bank and the assembly of complete quizzes from items that
have passed the appropriate quality standards. These will be used for formative assessment of
selected classes prior to the questions being rolled out for use throughout the community. The
suitability of the materials for summative assessment will also be explored, prior to their live use in subsequent years.
A full Gantt chart showing detailed tasks and milestones is included as a separate document with this
submission. A brief summary is given in Figure 1.
Sep08–Oct08: start-up, literature review, tagging
Oct08: expert-led seminar
Nov08–Apr09: evaluate items
May09–Jun09: evaluate phase 1
Jun09: interim report
Sep09–Mar10: construct and evaluate formative tests
Apr10–May10: evaluate phase 2
Jun10: evaluate
Sep10 onwards: conduct and evaluate summative tests

Figure 1. Project timeline.
The major cost of the project will be the salaries of the investigators, estimated at £51688 over the
duration of the project. This will be borne by the participating departments. LTDF support is
required to employ an RA for the 71 working weeks (spanning 89 calendar weeks) of the project. A
suitable RA is currently employed part-time by the University and is costed pro rata at a salary of
£23692 (R+T point 6, £48/h FEC):
U Barrett, staff no 219459 (£48/h):
average 7 h/wk for 35 wk = £11760
average 3.5 h/wk for 36 wk = £6048
It is also proposed to employ six students, full-time for one week at the project start-up, then 3.5 h/wk
for 11 weeks, all at a rate of £7/h, giving a total cost of £3088.
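As a rough cross-check of the staffing arithmetic above (the rates and weekly hours are as quoted; a 35-hour week is assumed for the students' full-time start-up week):

```python
# Cross-check of the quoted staffing costs. Rates and hours are taken from
# the text; the 35-hour full-time week is an assumption.

ra_cost = 48 * (7 * 35 + 3.5 * 36)      # RA: 35 wk at 7 h/wk, then 36 wk at 3.5 h/wk
student_cost = 6 * 7 * (35 + 3.5 * 11)  # six students: one full-time week, then 11 wk at 3.5 h/wk
other_costs = 500 + 400                 # consumables/media plus seminar
total = ra_cost + student_cost + other_costs

print(ra_cost)       # 17808.0 (= £11760 + £6048)
print(student_cost)  # 3087.0 (quoted as £3088)
print(total)         # 21795.0
```

The total of £21,795 matches the sum of the quarterly spend profile in Table 2.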
Additional expenditure has been included as follows:
- consumables and media: £500
- expert seminar cost: £400
A spreadsheet showing the weekly spend profile is included as a separate document with this
submission. A quarterly summary is given in Table 2.
Sep 08 £2689
Oct-Dec 08 £5902
Jan-Mar 09 £3696
Apr-Jun 09 £3360
Jul-Sep 09 £1276
Oct-Dec 09 £2016
Jan-Mar 10 £1848
Apr-Jun 10 £1008
Table 2. Quarterly spend profile
Should this application be successful, I give consent for this application to be published on the
Learning and Teaching Centre website.
Signature of Project Leader
Ron D Thomson
Thanks are due to L+T staff, in particular Deneka MacDonald and Mary McCulloch who contributed
to the writing of this proposal and who have indicated their willingness to assist with the project.