A Report on the Assessment Working Party
Background and Summary:
The Assessment Working Party was instituted by the Quality Development
Committee to consider further and implement (where appropriate) the
recommendations of the November 2004 Audit of the College’s policies,
procedures and practices in relation to Section 6 (on assessment) of the QAA
Code of Practice, and to draft a College Assessment Policy and revise the
College Assessment Regulations.
The Assessment Working Party met on three occasions. In addition, the Chair
and other academic staff met on various occasions to discuss relevant issues
and these meetings have contributed to the Working Party’s deliberations and
the development of policy.
In the course of its activity, the Working Party decided that a more
comprehensive mapping of College policy and practice against Section 6 (on
Assessment) of the Code of Practice would be beneficial. The mapping contained
in Appendix A of this Report should be considered as background to the
recommendations and areas for consideration identified by the Assessment
Working Party.
The Working Party considered each of the action points suggested in the
Assessment Audit Report (November 2004) that were within its remit. In the
section marked Recommendations below, detailed responses to each point are
set out together with 11 recommendations (in bold) for the Quality Development
Committee. There are some areas where the Assessment Working Party has so
far been unable to reach consensus on the specifics of a recommendation, or
believes that the final decision is best debated in full Committee. These are
identified for the Committee in the section entitled Considerations.
A draft College Assessment Policy is set out in Appendix B. This is a brief
document setting out the purposes of assessment at the College (i.e. what it is
for) and the principles that should underpin its design, approval, conduct,
recording, monitoring and review (i.e. how it should be done). It is purposefully
succinct so that it can be issued to all staff but contains pointers to other key
documents and sources of advice. Appendix C contains draft revised
Assessment Regulations for the BA and BSc awards. Sister documents will be
derived from these for our foundation degrees and our postgraduate courses.
Regulations are the rules that students must abide by in summative assessment.
These need to be supported by Procedures (what staff must do in assessing
students) and Guidelines of Good Practice (the best approaches). The review
of existing examples and the development of a comprehensive suite of such
documents are ongoing.
Quality Team Final draft 1
The recommendations of the Working Party, in response to the Assessment
Audit Action Points (italicised), are as follows:
a) Ensure that the new assessment policy promotes clearly the importance of
alternative assessment methods and the need to highlight these in course
documentation in order to support equity.
The principle that assessment ‘shall be fair, equitable and take account of the
diversity of the student body’ should be embraced by the College and has been
incorporated into the draft College Assessment Policy [see Appendix B].
The implication of this principle is that we should anticipate the needs of
disabled students, and indeed the Disability Discrimination Act (2003) requires
that the College does this. It follows that in specifying modes of assessment in
validation or in drafting project briefs, staff should be encouraged to consider
carefully whether the assessment methods may set up any unnecessary barriers
for students. An example of this is the barrier which a written examination might
pose for dyslexic students when achievement of the learning outcomes might be
demonstrated in a valid and rigorous fashion by other modes of assessment.
It is considered good practice across the HE sector to allow students to choose
between a number of assessment methods (for instance, the same learning
outcomes on a unit might be met by either an essay or a visual presentation) where
possible. However, it is not always possible to specify alternative assessment
methods in the unit specification particularly given the project-based nature of
much assessment in the College. It may be that alternative methods of
assessment have to be specified within the project brief (as long as those
methods fall broadly within the assessment requirements validated in the unit
specification). It is unlikely that course teams will always be able to predict the
diversity of student needs (in terms of disability) which may need to be
accommodated and to specify in advance each and every kind of ‘equivalent’
assessment that might be appropriate to a particular case. There will always be
a need to consider individual accommodations of assessment requirements. In
some cases, additional support in meeting the validated assessment
requirements may be more appropriate than the accommodation. The latter will
certainly be the case where there are constraints on the modes of assessment
appropriate to particular learning outcomes.
On balance, the Working Party considers that while the issue identified by the
Assessment Audit should be addressed (see Recommendation 1), there is
greater need for clarity about the rules governing alternative assessment
arrangements. Although there is currently a Procedure governing special
arrangements for unseen examinations (a relatively uncommon mode of
assessment at the College), there is no parallel procedure for other assessments.
Staff have expressed a desire for increased support and guidance in devising
alternative assessment requirements (beyond the procedural level). In
responding to the action point, the Working Party has therefore identified an
additional area of work that cannot be responded to within the current timeframe
for the Working Party (see Recommendation 2).
Recommendation 1: That the Quality Team develop and distribute guidance to
staff for use in revalidation which requires that:
i) staff consider carefully whether any particular mode of assessment
(unit assessment requirement) sets up an unjustified barrier for
particular groups of students;
ii) in order to accommodate the diversity of the student body, unit
specifications specify at least one alternative assessment method
(i.e. a choice between assessment requirements) where this is
possible;
iii) a standard statement be incorporated in the assessment
requirements section of unit specifications (and project briefs) along
the following lines:
“Alternative assessment arrangements may be made or additional
learning support arranged for students with disabilities or medical
conditions which would impair their performance in meeting the
above requirements and who have registered in advance with
Student Support.”
Recommendation 2: That the Registry, the Quality Team and Student Support should:
i) review the current Arrangements for Unseen Examinations for
Students with Disabilities and incorporate them in a more comprehensive
Special Assessment Arrangements Procedure to govern
arrangements for alternative assessment which ensures their validity;
ii) develop Guidelines of Good Practice to support staff in devising
alternative assessments and accommodating assessment for
students with disabilities.
b) Initiate an initial assessment process for students on first degrees to
ensure that learning support needs are met quickly and efficiently.
The Assessment Working Party endorses this recommendation and has followed
closely the work of the Personal Transferable Skills Working Group in
investigating appropriate testing methods. It is recognised that compulsory
screening of students for learning disability is uncommon practice across higher
education. However, diagnostic testing which may offer an indicator of such
difficulties so that particular students can be referred for testing would be an
enhancement of our provision. Such diagnostic testing should not form part of the
summative assessment of a course though it may be compulsory. The detail of
implementation is best considered by the Personal Transferable Skills Working
Group and the appropriate officers in the College.
c) Be cautious about the over-formalising of formative assessment at the
expense of regular formative assessment, especially if it is generating
excessive administrative burdens for staff. Be clear in the new policy about
the role of formative assessment.
It is recognised that students, particularly in the disciplines in which the College
operates, favour informal oral feedback over the written feedback received in
summative assessment. However, the College has a duty to ensure that there is
consistency of approach in terms of written feedback. The College is also
accountable for the quality of its provision to its validating partner and external
agencies. It must demonstrate how it knows that students are getting sufficient
and appropriate feedback. It is recognised that there is a staff perception that
current written feedback mechanisms are burdensome. However, this could be
addressed by simplifying existing paperwork to ensure that effective written
feedback can be given in an efficient manner.
Recommendation 3: The Assessment Working Party believes that the
development of a ‘fast’ assessment sheet, preloaded with learning outcomes
and assessment criteria specific to each unit, would facilitate written
feedback without over-burdening staff.
A pilot feedback form is being developed for implementation on the bridging units
run over the summer period.
d) Continue to review and evaluate internal examination practice to ensure
rigour is maintained.
In the current year the College has introduced increased rigour into its internal
examination procedures. Internal moderation boards have been formalised and
are now supported by Registry.
Recommendation 4: The Assessment Working Party recommends the formal
approval of the Guidance on Internal Moderation and Final Examination
Boards (see Appendix D), subject to any minor revisions resulting from the
experience of running the Boards in the current academic year.
The administration of assessment within the College has been fragmented. The
conduct of Examination Boards (and Progression Boards) has been supported by
Registry and, in the current year, this has extended to the support of internal
moderation boards. However, the conduct of unseen examinations is supported
by Student Services (i.e. invigilation etc.).
Recommendation 5: The Working Party believes that the administration of
assessment should be the responsibility of one department within the College
and, in line with other institutions, that this should be Registry. This should
include the administration of unseen examinations.
e) Devise clear regulations about the practice of double marking with
appropriate guidelines and documentation to support this practice (e.g.
second marker’s sheets or an addition to the existing assignment sheet).
The current Guidelines for the Moderation of Assessment do not specify exactly
the College’s requirements in regard to internal moderation. For instance, they do
not state the quantity of marking which should be moderated (or ‘verified’, as the
term is used in some institutions). Practice in peer institutions and at our validating
partner, the University of Sussex, suggests that sampling is appropriate in the
lower levels of an award but that a comprehensive approach is needed in the final
level of an award or in any unit which contributes to the classification of an award.
Recommendation 6: The Guidelines for the Moderation of Assessment should
be revised to require that:
i) all final level marks be internally verified, i.e. be subject to the
scrutiny of more than one assessor. This may be by group marking,
double marking or blind marking;
ii) 10% of all work in each unit in levels other than the final level be
internally verified, i.e. be subject to the scrutiny of more than one
assessor. This may be by group marking, double marking or blind
marking;
iii) Registry/MIS have developed new marksheets to support this
internal moderation practice which include provision for the
signatures of two staff. No unit results should be submitted to a
Final Examination Board without the signatures of two staff
certifying that the unit has been internally verified.
f) Initiate a cross-college internal moderation process for written
assignments with representation from other teams at exam boards.
Faculty Verification Groups were introduced in the current academic year with a
remit to scrutinise and verify Project Briefs. Initial experiences show that while
these have been successful, a sampling methodology is the most effective
means of carrying out this function.
Recommendation 7: The Assessment Working Party recommends that the
faculty verification groups are re-visited in September as they have only been
operational for two terms.
The Assessment Working Party does not believe that the most effective means of
gaining consistency in marking is through cross-College representation on
Examination Boards. Effective internal moderation procedures (see
Recommendation 6 above) are more likely to be useful in this regard. The small
size of the College and the emphasis on cross-disciplinary projects mean that
many staff are already involved in assessment across courses and, to a lesser
degree, in Examination Boards. The Head of Faculty is normally in attendance
at each Examination Board and the Head of Registry is on hand to advise on
procedure. Annual Course Monitoring and other measures ensure that
differences in result profiles are a genuine reflection of subject culture.
g) Review procedures and documentation for the accreditation of prior
learning and prior experiential learning.
Revised Guidelines on the Accreditation of Prior Learning were issued by the QAA in
September 2004. The Assessment Officer has conducted an analysis of our procedures
against these guidelines and revised our procedure accordingly (Appendix E).
Recommendation 8: The Assessment Working Party recommends to QDC the
draft Policy on the Accreditation of Prior Learning set out in Appendix B.
h) Continue to work with employers to move towards the inclusion of
employer assessment on Foundation Degrees.
The inclusion of employers in assessment is seen as desirable in the QAA
Foundation Degree Qualification Benchmark. Many institutions have been
cautious about the implementation of employer assessment because of the risks
associated with practice (i.e. parity of student experience/ guaranteeing that
assessment will be criterion based). The College has traditionally shared this
reluctance. Work Based Learning is a feature of many of our courses (not
exclusively the foundation degrees). Employers, while open to the provision
of placement opportunities and involvement in Work Based Learning through live
and collaborative projects, have often expressed a reluctance to involve
themselves in the burden and formality of assessment. There are many instances
across the College of employer involvement in the setting of briefs and in
formative assessment. However, there is still considerable reluctance on the part
of staff to move beyond this position. The Working Party recognises that
employer involvement enhances many areas of the College provision. However,
such involvement should be accompanied by strict guidelines on the nature of the
involvement and the safeguards in place to secure the assessment process. The
Working Party recommends:
Recommendation 9: That there is further investigation of how employers are
involved in assessment on foundation degrees at other institutions before the
College position is formalised.
i) Review assessment methods and practice in relation to work experience.
Revalidation affords course teams an opportunity to look at the integration of
placement learning and other forms of learning on their courses and the degree
to which the assessment of placement learning supports this.
Recommendation 10: The Assessment Working Party recommends re-examining
the validated assessment of placements in relation to each course as part of
revalidation.
j) Continue to review systems and provide clear procedures for the recording
of assessment decisions with related documentation.
The Assessment Working Party recognises the need for robust procedures and
supporting documents. As referred to above, a number of enhancements have
been made to the documentation supporting the conduct and recording of
assessment.
In the course of its discussions of internal moderation (verification) and the
auditability of our assessment processes, the Working Party came to recognise
the need for guidelines in regard to the retention of student work. However, it also
recognises that, in the discipline areas in which the College operates, this may
not be as straightforward as in more script-based subject areas.
Recommendation 11: The Assessment Working Party recommends that further
consideration be given to guidelines on the retention of student work,
particularly large-scale practical work, where retention poses difficulties for
the College and may be detrimental to students.
The Assessment Working Party wishes the Quality Development Committee to
consider the following areas:
Compensation – The Assessment Working Party was unable to reach a
conclusion over whether to allow for the compensation of 20% credit
failure within one level. Current regulations allow only for compensation
within a unit. Compensation of failure of a unit is common practice in other
higher education institutions. It allows an Examination Board to condone
the marginal failure of one or more units in the case of students who have
demonstrated high levels of achievement of the learning outcomes in other
areas of the course (Appendix D, Section 13).
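The 20% compensation rule under discussion amounts to a simple check on failed credit at a level. As an illustration only (the function name and the credit values are hypothetical, not drawn from the regulations):

```python
# Sketch of the level-compensation rule under consideration: an
# Examination Board may condone marginal unit failure provided the
# failed credit does not exceed 20% of the credit at that level.
# Credit values here are hypothetical examples.

def compensatable(failed_unit_credits, level_credits, limit=0.20):
    """True if the total failed credit is within the compensation limit."""
    return sum(failed_unit_credits) <= limit * level_credits

print(compensatable([20], 120))      # one 20-credit unit of 120: within the limit
print(compensatable([20, 20], 120))  # 40 of 120 credits: exceeds the limit
```

Note that this checks only the credit volume; the Board would still need to judge whether each failure is "marginal" and whether achievement elsewhere justifies condonement.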
Retrieval of Failure – Capping – Current regulations are punitive in
respect of the failure of a unit particularly in the final year. Retrieval of
failure is currently capped at an E. Within the current algorithm, the
arithmetic effect of such a result on the classification of an award was
purposefully disproportionate. However, this effect will be cancelled by the
introduction of percentage marking. Some institutions do have a higher
cap (i.e. ‘D’) for the retrieval of failure than the retrieval of non-submission.
However, the Working Party cautions against this added layer of
complexity. The real problem is the provision in the third level that an ‘E’
grade at the honours level results in the capping of the classification of the
degree at ‘Pass without honours’. This seems to some members of the
Working Party to be ‘double counting’: out of line with the principles of
credit accumulation inherent in a unitised course framework, and punitive.
There are strong arguments for not retaining these provisions.
Retrieval of Failure – Number of Attempts – Current regulations allow
for only one attempt at retrieval. Many institutions allow two attempts.
There are strong arguments for allowing the Exam Board the discretion to
allow another attempt. This facilitates the setting of retrieval tasks by
Course Leaders early in the year to prevent the ‘snowballing’ of failed
credit. Where a student fails this first retrieval task, the Exam Board would
have the discretion to set a further task in cases where it is believed the
student could retrieve the unit within a viable period (i.e. over the summer).
Deferral – The current regulations are silent on this. It is proposed that the
College introduce rules governing deferral. Henceforth, students would only
have an automatic right to defer at the beginning of the academic year (i.e.
on successful completion of the previous level). All other applications
would be by special permission of the Head of Faculty. Such deferrals
would be allowed only on a term-by-term basis (i.e. students would have
to return at the beginning of the term from which they deferred).
Grading Descriptors – The Assessment Working Party recognises that
these are dated and in need of review. An external examiner during a
recent visit referred to them accurately as a ‘deficit model’. The
Assessment Working Party is inclined to a more detailed model than those
currently used. This would be based on generic assessment criteria (i.e.
analysis, knowledge etc) with specific grading descriptors relating to each
of the criteria. This would allow a stronger relationship between the
College Grading Descriptors and the assessment criteria used in units.
Extenuating Circumstances – The Working Party recognises the need
for some form of verification of extenuating circumstances approved by
staff to ensure that its procedures are rigorous and fair. However, given
the size of the College, the addition of a further board is likely to be
burdensome. The Working Party believes that further thought needs to be
given to the mechanisms that might be used for this function.
Degree Classification Margins – Initially, it was envisaged that the new
system of classification would be entirely arithmetic, being derived from the
weighted average of the units in the final level of the course. Thus students
with weighted averages falling within 0.5 below a classification boundary
would be rounded up automatically into the upper classification (i.e. a 2.1 is
notionally 60 to 70 but in practice 59.5 to 69.5). However, there may be a case
for recognising the achievement of students who fall just below the thresholds
of a classification through the application of condonement criteria similar to
those used in the current year. So, for instance, a first could be gained by
weighted average (i.e. 69.5 or above) or by a lower weighted average together
with the achievement of specific criteria (e.g. 68% plus at least half of the
credit at grade A and no unit below grade C).
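To make the arithmetic concrete, the rounding and condonement rules described above might be sketched as follows. This is an illustration only: the boundaries below the 2.1, the function names, and the precise form of the condonement criterion are assumptions extrapolated from the examples in the text, not agreed regulations.

```python
# Illustrative sketch of the proposed classification arithmetic.

def classify(weighted_average):
    """Classify a final-level weighted average (percentage marking).

    Averages falling within 0.5 below a boundary round up automatically,
    so a 2.1 is notionally 60-70 but in practice 59.5-69.5. Boundaries
    below the 2.1 are assumed by analogy.
    """
    boundaries = [(69.5, "First"), (59.5, "2.1"), (49.5, "2.2"), (39.5, "Third")]
    for threshold, label in boundaries:
        if weighted_average >= threshold:
            return label
    return "Fail"

def classify_with_condonement(weighted_average, unit_grades, credit_at_a):
    """Apply the example condonement criterion for a first:
    68% or above, at least half the credit at grade A, no unit below C.

    unit_grades: letter grades (A-E) for the final-level units
    credit_at_a: fraction of final-level credit achieved at grade A
    """
    base = classify(weighted_average)
    order = "ABCDE"
    no_unit_below_c = all(order.index(g) <= order.index("C") for g in unit_grades)
    if (base == "2.1" and weighted_average >= 68.0
            and credit_at_a >= 0.5 and no_unit_below_c):
        return "First"
    return base

print(classify(59.5))                                             # a marginal 2.1
print(classify_with_condonement(68.2, ["A", "A", "B", "C"], 0.5))  # condoned first
```

A purely arithmetic scheme is simpler to administer and audit; the condonement route trades some of that simplicity for recognition of strong but uneven profiles, which is the choice put to the Committee here.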
It can be seen that there are a number of areas requiring further development
and that the Working Party has identified additional areas needing attention. For
this reason, the Assessment Working Party recommends that the life of the group
be extended until the end of December 2005. The Working Party considers that
this extension of the timescale is needed to address all points in its terms of
reference.
The membership of the Assessment Working Party was as follows:
Bill Schaaf Senior Lecturer Digital Media (Chair)
John O’Boyle Head of Quality
Sue Cowan Student Welfare Officer
Michael Davidson Senior Lecturer Broadcasting
Anne Pascall Assessment Officer
Renate Divers Head of Registry
Steven Bowman Head of Student Information Services
Christine Roberts External Consultant
Neal White MA Representative
Louise Lidington FE Representative
Mary Hutton External Consultant - HEFCE
Rachel Sugarman Quality Officer (Secretary)