13th Improving Student Learning Symposium, 2005
Effective feedback through overt use of criteria and target setting
Colin Hughes, School of Biomedical and Natural Sciences, Nottingham Trent University
A shared understanding of the assessment and feedback processes is vital if
students are to gain maximum benefit in developing their learning and staff are to
support this effectively.
The EFEL (Effective Feedback, Enhanced Learning) Project has been supporting
student understanding of generic grade-based criteria and task-specific assessment.
Transparent criteria have been devised for all assessment tasks. A range of
feedback strategies and methods have been trialled and data gained on student
interaction with feedback.
While some elements of specific knowledge and understanding have little potential
for feeding forward into later pieces of work, aspects of skill development within the
writing of reports, advice on the balance within reports, and comment on the depth
and breadth of scientific inputs have potential to inform future work. Helping
students to interact with feedback in such a way that it feeds forward to future pieces
of assessed work has been aided by target setting. Some early data and
experiences will be commented upon.
This paper explores how a better understanding of the assessment and feedback process
and more overt use of assessment criteria can improve the performance of some students.
The principles of effective feedback have been identified by Nicol and Macfarlane-Dick
(2004). Glover, Macdonald, Mills and Swithenby (2004) and data collected at Nottingham
Trent University indicate that students perceive feedback as being most effective when it is
individualised and written. Often they do not recognise the importance or significance of
generic oral feedback. Despite the enormous effort expended in designing assessment
tasks and the vast amount of lecturer time absorbed in marking and providing feedback,
there is evidence that students rarely use the comments offered to enhance the quality of
their later work (Brown, Glover, Stevens and Freake, 2004). With this in mind, the EFEL
project has developed target-setting strategies that help students who do interact with the
feedback to respond appropriately as they tackle future work, so enhancing feedforward
(Hughes, 2005).
Project Aims and Activities
The EFEL project was specifically designed to identify, build on and disseminate good
practice in the effective and consistent application of grade-related criteria and task
assessment criteria, and in the assessment of students’ work and the provision of
feedback. While the initial focus was on embedding existing good practice for the benefit
of students and staff, the project has gradually increased its research dimension.
[Figure 1 is a diagram of nine measures arranged around 'Assessment & Feedback':
1. Student focus groups; 2. Input into induction; 3. Generic criteria workshops;
4. Exemplar students' work; 5. Assessment and feedback audit; 6. Module task audit;
7. Generic grade criteria rewrite; 8. Staff workshops; 9. Task-specific criteria
assessment sheets.]
Figure 1. Improving students' and staff understanding of the assessment and feedback
process.
Improvement has been achieved by the series of measures indicated in Figure 1. Too
often it is assumed that students understand the assessment and feedback process, that
they understand the language and terminology used and that the assessment task and
criteria are written in a way that offers them easy access to the task and the staff
expectations associated with the task. Invariably, the reality does not match this ideal and
a gap develops between staff expectations and student response to the task. The actions
taken are intended to bridge that gap.
1. Student focus groups. Students readily engaged in sharing with staff the strengths and
perceived weaknesses of assessment tasks and feedback and the effect of the EFEL
project on these aspects.
2. Input into induction. Inputs on assessment and feedback are made at induction and at
regular intervals thereafter through programme meetings. This includes reflecting on staff
feedback and use of target setting.
3. Generic grade criteria workshops. Workshops are offered to help students understand
the terminology within the criteria and the expectations behind allocating marks
concordant with particular grade thresholds.
4. Exemplar students’ work. Student work that has been graded for a particular section, eg
the Introduction, Results or Discussion, has been made available to students to help them
understand further the criteria and the depth and breadth of response required. Some
difficulties have arisen here due to the subject specific nature of some parts of the
materials; the generic ‘skills’ aspects are sometimes overlooked.
5. Assessment and feedback audit. An overall picture of assessment and feedback
strategies was obtained through data from consortium and associate partner institutions.
6. Module assessment task audit. An analysis of the number of assessment tasks within a
module of a particular credit point rating indicated relatively large discrepancies particularly
with regard to student work demand both within and outside the formal teaching time.
7. Generic grade criteria. Generic grade criteria have been reconsidered and put into
context within levels 1-3 of undergraduate programmes.
8. Staff workshops using generic criteria. 'Agreement trialling' has taken place involving
laboratory files and formal reports, with two standard formats devised for students
reporting scientific laboratory work. This has helped the consistency of marking and a
more common understanding of the criteria.
9. Task-specific criteria assessment sheets (TSCASs). This is a student entitlement for all
assessed tasks. Currently, TSCASs are being revised to provide a progression of assistance
and hence student autonomy as they move from Levels 1 to 3 in their undergraduate studies.
There has also been a rise in student achievement as indicated by end line marks and degree
classifications awarded, though clearly this is influenced by many factors.
The learning environment that we have sought to achieve is represented in Figure 2.
[Figure 2 is a cycle diagram: the student understands the generic grade criteria and the
criteria for the task; the student carries out the task; staff mark using task-specific
and generic grade criteria; the student reads and interacts with the feedback;
improvement follows.]
Figure 2. The assessment and feedback process and how strategies within the EFEL
project support students and staff.
The overt use of criteria, a major theme of the action taken, is supported by task-specific
criteria assessment sheets (TSCASs). These are a student entitlement and are being
developed by all staff as elements of good practice. They inform students of the aspects to
be assessed and identify the mark allocation. There is a progression in the level of support
offered, with more at Level 1 and less at Levels 2 and 3. They are being developed in a
variety of formats, though a common format might be used after evaluation later in the project.
Two examples of TSCASs are given below indicating the progression of support for
student learning. These are illustrated for the same task as if it were set at both levels 1
and 2. The task specific criteria are reflected directly in the feedback given to students, so
furthering their understanding of the assessment process and helping them to understand
the mark given.
At Level 1:

Criteria for assessment | Max mark | Mark and comments related to the task (and generic grade) criteria
Appropriate title and abstract | 20 |
Relevance of introduction to the study: discussion of biological indicators (TBI and BMWP); organic pollution and the depletion of oxygen; self-purification and the O2 sag curve; BOD | |
Accuracy of TBI and BMWP; selection of key graphs (eg BOD plotted against DO2, and BOD or DO2 plotted against TBI or BMWP / ASPT) | |
Discussion: etc. | 50 |
And at Level 2:

Criteria for assessment | Max mark | Mark and comments related to the task (and generic grade) criteria
Title, abstract and introduction: succinct | 20 |
Results: tables of data, calculations, graphs | 15 |
Discussion: etc. | 50 |
Evaluations from students and comments from External Examiners have indicated that
strategies developed and/or improved by the project such as the use of effective grade-
related criteria and task-specific criteria have improved confidence in the consistency of
marking and made moderation easier.
The views of approximately 100 students were obtained one term after the introduction of
the strategies, using questionnaires administered in a format that achieved a 100%
response rate. Students' perceptions of TSCASs have been very favourable, with 83% of
students reporting that they were useful or very useful and no responses that were totally
negative.
[Figure 3 is a bar chart of responses across the categories Very Useful, Useful, Fairly
Useful, Not Very Useful and Not At All Useful.]
Figure 3. Student perceptions of TSCASs.
Student views of the assessment sheets that relate feedback to the assessment criteria
were even more positive. Again there were no negative reports.
[Figure 4 is a bar chart of responses across the categories Very Useful, Useful, Fairly
Useful, Not Very Useful and Not At All Useful.]
Figure 4. Student views of assessment feedback sheets related directly to the criteria.
Using questionnaires allowing free-format responses, a survey of 100 students within the
EFEL project indicated that students value both information on where they have gone
wrong and the encouragement of being told where they have been successful. They want
legible feedback (an issue for one institution in the project), constructive criticism and help
with improving future work (i.e. feedback that feeds forward).
The same students do not value feedback that is illegible or that is marked inconsistently
with the published guidance. They are dissatisfied with receiving only a mark, with
feedback that lacks detail or that is perceived as being destructive.
Sources in the literature (eg Gibbs and Simpson, 2004) suggest that students value timely
feedback. Our survey indicates that students value detailed, less prompt feedback more
highly than less detailed, more prompt feedback. This corroborates the value that they
place in feedback and indicates a desire to receive as much support as possible.
[Figure 5 is a bar chart of preferences across the categories: rapid generic feedback;
less detailed, more prompt feedback; more detailed, less prompt feedback.]
Figure 5. Student feedback preferences.
Students consider that staff comments on both the mark (feedback) sheet and the script are
preferable to comments on either the script or the mark sheet alone. Again, this indicates that
students value detailed feedback and do not consider feedback in two places particularly
difficult to interpret.
[Figure 6 is a bar chart of preferences for feedback on the script alone, on the mark
sheet alone, and on both.]
Figure 6. Student feedback location preferences.
It has been reported that it is rare for students to use feedback to enhance the quality of
their future work (Brown et al., 2004). We have no evidence about specific usage but
students do appear to interact with the feedback and spend some time considering it.
Some 67% of students indicate that they read all feedback comments whether made in
summary or on the script, with 27% of these also reading their work to put the comments in
context.
[Figure 7 is a bar chart of student interactions: mark and summary; mark, summary and
comments on the script; reading over the work.]
Figure 7. Student interactions with feedback.
These data suggest that there is potential for considering methods to assist students to
interact in a more focused fashion with feedback. The information provided should feed
forward into their next piece of work. Brown and Glover (2005, this volume) indicate how
this feed forward is important in the development of learning.
Feedback that can usefully be fed forward
Not every aspect of feedback is useful in
feeding forward to the next piece of work. As students’ scientific work is almost invariably
framed within a specific science knowledge context, it is not surprising that most students
see this knowledge base as the focus for their energies when, for example, they are writing
reports, particularly if these are ‘formal reports’ requiring recourse to the literature. It
follows that feedback given by staff on the science knowledge and understanding aspects
of one report is unlikely to have formative power for the next report, as the latter is likely to
be on another topic. Of course, there may be some feed forward to revision.
However, there is potential for feeding forward generic skills. For example, in the writing of
reports, feedback may relate to the drafting of an effective title, abstracts containing the
important features, results sections which are correctly titled, and appropriate displays of
data with units displayed. These are relatively low level ‘skills’ and are considered
relatively easy to develop. There is also the opportunity for feeding forward advice on
higher level skills relevant to the student’s synthesis and evaluation activities, eg the
breadth and depth of scientific content introduced and discussed in relation to the scientific
literature. However, these higher-order 'skills' may be more impervious to remediation,
as students do not find it easy to apply feedback obtained in one scientific context in
another. The data obtained from students indicate that there is a need to shift perceptions
about the scope for feedback that feeds forward.
Target setting: feeding forward to enhance learning
Students within the EFEL project
respond well to clear, overt criteria when they understand the terminology of generic grade
criteria and task specific criteria. They require detailed feedback from staff that informs
them of the gap between their work and staff expectations and also confirms to them those
areas that were carried out competently (Gibbs and Simpson, 2004). Most students
interact in some detail with the feedback they receive but few use it effectively. Therefore,
in order to complete the loop in the staff feedback and student feed forward process,
students need to be encouraged to set personal targets as they interact with feedback and
to use this information as they carry out the next piece of assessed work.
Target setting has been successfully used in UK teacher education for some time in
achieving the Secretary of State’s ‘Standards’. It offers students an opportunity to take
responsibility for learning and helps them to understand the complex assessment process
more clearly and quickly.
Initial attempts to encourage students to set targets based on the feedback they had
received were not very successful, and demonstrated that students needed help. They had
been given no format or structure to assist them in the process and they did not fully
understand what they were being asked to do. They found the analysis of their feedback
and then the transfer of this information into targets difficult. This led to the following
template format being devised and trialled:
FORMAL REPORTS and TARGET SETTING
This form is to try to help you to identify TARGETS for improvement on your next FORMAL
REPORT.
• Look at the comments on the script of your 'Catalase' formal report and particularly
the 'Assessment Sheet'.
• Identify targets to achieve next time:
A. FORMAT OF THE REPORT (eg number Tables / cite references correctly)
B. CONTENT OF THE REPORT (eg more detail in Introduction / Discussion)
STUDENT NO. ..................................... Date: ..........................
However, many students have still found difficulty with setting their own targets based on
feedback. Staff are now including a target within their feedback / feed forward and
seminars are being held to support students’ understanding of the process using their own
scripts that have recently been returned. The target sheet so compiled is then submitted
with the next piece of work and comments are made by staff as to whether the targets
have been achieved or not.
After nearly a year of working on target setting with one group of foundation degree
students, only 20% write down their targets on a regular basis, though another 20% claim
that they now use previous work to feed forward to their next piece of assessed work. The
comments of one student are worthy of repetition to illustrate the potential power of the
approach:
‘I am, a student at the NTU studying FdSc Biology. Throughout this year I have
completed a number of formal reports and assignments. At the end, when the reports
have been handed in and marked, I received them back with comments and my grade.
For the next report I used the comments on the formal report I got back, to improve my
next formal report. I did this for all of my work. When I started this year for every report
we received we were given target sheet. At first I did not think they were going to make
a difference because I didn’t really understand how to fill them in. In my last module,
which was Natural Systems, I completed the target sheet using the comments received
back from previous reports and wrote down how I was trying to improve them and at the
end of the report which target I had achieved and to what extent.
When I received back my report it boosted my grade from high 50’s/low 60’s to 74%, it
really helped and improved my work. I regret not using it for all other reports as I could
have achieved so much better. It was like a little guideline which I stuck by throughout
my report. I think it’s an excellent way to improve students’ work and should definitely be
used more in the future. It’s a shame not every lecturer used them because if it helped
me in one of my modules it could have helped in all of them. It’s quick and easy to fill one
out and it could really help to boost grades up by even 10%. It’s also an advantage to
lecturers to see where students are struggling and need more help’.
References
Brown, E., Glover, C., Stevens, V. and Freake, S. (2004) Evaluating the effectiveness of written
feedback as an element of formative assessment in science. Proceedings of the 12th Improving
Student Learning Symposium, C. Rust (Ed.), The Oxford Centre for Staff & Learning
Development, Oxford Brookes University, pp. 470-479.
Brown, E. and Glover, C. (2005, this volume) Refocusing written feedback.
Gibbs, G. and Simpson, C. (2004) Does your assessment support your students' learning?
Learning and Teaching in Higher Education (online), 1(1), 3-31. Available online at
http://www.glos.ac.uk/adu/clt/lathe/issue1/index.cfm (accessed 1st November 2005).
Glover, C., Macdonald, Mills, J. and Swithenby, S. (2004) Perceptions of the value of different
modes of tutor feedback. Proceedings of the 12th Improving Student Learning Symposium, C.
Rust (Ed.), The Oxford Centre for Staff & Learning Development, Oxford Brookes University, pp.
Hughes, C.E. (2005) Headlines from the EFEL project: enhanced learning through target setting.
Proceedings of the Science Learning and Teaching Conference 2005, P. Goodhew et al. (Eds.),
The Higher Education Academy Subject Centres for Bioscience, Materials and Physical
Sciences, pp. 75-78.
Nicol, D. and Macfarlane-Dick, D. (2004) Rethinking formative assessment in HE: a theoretical
model and seven principles of good feedback practice. Available online at
http://www.heacademy.ac.uk/assessment/ASS051D_SENLEF_model.doc (accessed 3rd