Our experience in
evaluating drug abuse prevention
UNODC Prevention Treatment &
CICAD VII Meeting of the Expert Group on
Demand Reduction, 13-15 September 2005,
UNODC has carried out two kinds of work with
regard to monitoring and evaluation (“m & e”) of
drug abuse prevention:
1 -- Assessing the progress of member states
(“MS”) in meeting the commitments they made
in the Political Declaration of 1998 (including
drug abuse prevention, treatment and rehabilitation)
2 -- Identifying and disseminating good
practices in monitoring and evaluating drug
abuse prevention activities and programmes
implemented by youth- and community-based organisations
1 -- Assessing Member
States’ Drug Abuse
Prevention Activities
UN Member States Report on Prevention
Activities through Biennial Reports Questionnaires (BRQs)
• With regard to prevention, the Questionnaire asks:
• Whether MS have implemented drug abuse
prevention activities in different settings (yes/no)
• If yes, whether the coverage of the activities is
low/ medium/ high
• Whether the activities are sensitive to gender
• Whether they have been evaluated (yes/no)
Limitations of Questionnaires
• Provide the perception of Member States
• Provide limited information
• Only on implementation, not on impact (the
questionnaire only asks whether the
activities have been evaluated or not; it
does not ask about the results of the
evaluation)
• Yes/no, low/medium/high kind of answers
Still some useful indication, for example,
about the evaluation of prevention activities:
[Chart: % of Member States reporting that their prevention activities have been evaluated, by reporting period: 1998-2000, 2000-2002, 2002-2004]
How does the UNODC Questionnaire relate to
other existing regional instruments for
measuring the extent of prevention activities?
• Questionnaire to be reviewed in
October/ November 2005 in Vienna
• CICAD, EMCDDA represented
• Also to see how the monitoring work
can continue after 2008
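The yes/no answers described above can only be aggregated into simple shares. As a minimal sketch (all replies below are invented for illustration, not actual BRQ data), this is how a "% of Member States reporting that their activities have been evaluated" figure per biennium could be derived:

```python
# Hypothetical yes/no questionnaire replies: one entry per reporting
# Member State, grouped by biennial reporting period.
replies = {
    "1998-2000": ["yes", "no", "no", "yes", "no"],
    "2000-2002": ["yes", "no", "yes", "yes", "no"],
    "2002-2004": ["yes", "yes", "yes", "no", "yes"],
}

def pct_evaluated(period: str) -> float:
    """% of reporting Member States answering 'yes' to
    'have your prevention activities been evaluated?'"""
    answers = replies[period]
    return round(100.0 * answers.count("yes") / len(answers), 1)

for period in replies:
    print(period, pct_evaluated(period))
```

Note that this kind of aggregation shows only how many states report evaluating, never what those evaluations found, which is exactly the limitation flagged above.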
2 -- Monitoring and
evaluation of drug
abuse prevention by
youth- and community-based organisations
How we identify good practice
• Review of the (academic) literature identifies
principles and issues
• Principles/ issues are discussed and enriched
in meetings including youth/ prevention
workers and youth from all regions
• Results are also circulated and discussed
with focal points in national and international
organisations
• Next publication: MONITORING & EVALUATION!
• Next piece of work: Prevention of Amphetamine-Type Stimulant (ATS) use
All available on our website!
Monitoring & Evaluation
Note: These are the definitions we find useful; we are
aware that there are grey areas and that terminology
is used differently.
• Monitoring is about the implementation of
activities. It takes place during implementation
and feeds into evaluation.
• Evaluation is about the impact of activities. It
takes place ‘after’ implementation and
assesses changes in the situation of the
target group, including, but not limited to what
was done (implementation).
What (should be evaluated)?
• Preventing use?
Assessing impact in terms of drug abuse prevented
might be counterproductive:
• The activities of most organisations are too limited in the number
of risk/protective factors they address, in coverage, in
intensity, and in duration.
• To be valid, the kind of statistical analysis required is
complex and/or requires too large a sample
• Change in risk/protective factors?
Assessing impact in terms of whether the risk/
protective factor situation has changed (for factors
with evidence of a link to drug abuse) is more feasible
Example of a small youth group with the (long-term)
goal of decreasing the number of youth starting to
use substances in their community
• IDENTIFIED RISK FACTOR 1 -- Poor
communication between parents and youth
• (IMMEDIATE) OBJECTIVE 1 -- By the end of our
project, the communication between parents and
youth of our community will have improved.
• INDICATORS OF ACHIEVEMENT OF OBJECTIVE 1 --
Number of meals taken together by families has
increased -- Youth report better communication with their
parents, including on drug abuse issues
• ACTIVITIES PLANNED IN ORDER TO ACHIEVE
OBJECTIVE 1 -- Parenting skill session after school
once a week for two months -- Free family meals once a
week -- Family picnics once a month
• IDENTIFIED RISK FACTOR 2 -- Youth have too
much time on their hands with not much to do
• (IMMEDIATE) OBJECTIVE 2 -- By the end of our project,
the youth of our community will be more involved in
constructive activities in their free time
• INDICATORS OF ACHIEVEMENT OF OBJECTIVE 2 – No. of
youth involved in a constructive activity at least twice a week in
their free time increased – No. of youth spending their time
chatting in the street diminished
• ACTIVITIES PLANNED IN ORDER TO ACHIEVE OBJECTIVE 2
-- Organise sports training including a health promotion
component & participate in competitions -- Assist youth in
organising or finding other activities including a health promotion
component
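The logic model in the example above (identified risk factor → immediate objective → indicators → planned activities) can be sketched as a small data structure. The field names and the completeness check below are our own illustration, not a UNODC tool:

```python
from dataclasses import dataclass

@dataclass
class Objective:
    """One immediate objective in a simple prevention logic model."""
    risk_factor: str   # the identified risk factor this objective addresses
    statement: str     # what should have changed by the end of the project
    indicators: list   # observable signs that the objective was achieved
    activities: list   # what is planned in order to achieve it

# Encoding of Objective 1 from the example above
objective_1 = Objective(
    risk_factor="Poor communication between parents and youth",
    statement="By the end of the project, communication between "
              "parents and youth of the community will have improved",
    indicators=[
        "Number of meals taken together by families has increased",
        "Youth report better communication with their parents",
    ],
    activities=[
        "Parenting skill session after school once a week for two months",
        "Free family meals once a week",
        "Family picnics once a month",
    ],
)

def is_monitorable(obj: Objective) -> bool:
    """An objective can only be monitored and evaluated if it names
    at least one indicator and at least one planned activity."""
    return bool(obj.indicators) and bool(obj.activities)

print(is_monitorable(objective_1))  # True
```

Writing the model down in this explicit form makes the later steps (baseline data, monitoring of activities, evaluation against indicators) much easier to plan.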
How? A couple of basic principles
• (At least) collect baseline data, or collect data
as time goes by, to show how the situation
evolves
• Use a variety of methods to collect your
information to validate it (triangulation)
• To evaluate you also need good monitoring.
How can you say that what you did is
effective, if you do not know what you did in
the first place?
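The baseline principle can be illustrated with the indicators of Objective 2 from the earlier example. All counts below are invented for illustration:

```python
# Hypothetical indicator counts: number of youth observed in each
# situation before the project (baseline) and after it (follow-up).
baseline = {"constructive_activity": 18, "street_chatting": 40}
follow_up = {"constructive_activity": 31, "street_chatting": 22}

def change(indicator: str) -> float:
    """Percentage change of an indicator between baseline and follow-up."""
    before, after = baseline[indicator], follow_up[indicator]
    return round(100.0 * (after - before) / before, 1)

print(change("constructive_activity"))  # → 72.2 (increase)
print(change("street_chatting"))        # → -45.0 (decrease)
```

Without the baseline figures, neither change could be claimed at all; and, per the triangulation principle, such counts should be cross-checked against at least one other source (e.g. group discussions with the youth themselves).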
How? The methods
• Surveys through (self-administered)
questionnaires
• Not easy! Especially to get the sampling right and
to create a simple but effective questionnaire
• Labour-intensive! Testing the questionnaire,
ensuring anonymity and confidentiality, analysing
the data
• Provide numbers, which people (and donors) like
How? The methods
• Key informant interviews
• Provide a series of very specific points of view (‘biased’)
• Can give very useful insight, if the information is triangulated
• Group discussions (including Focus Group
Discussions; visual techniques, e.g. mapping; drama
based techniques, e.g. role playing)
• Provide quickly the point of view of a group of similar people.
Extrapolation is not easy, but still VERY useful insights
• Need experienced facilitation and a setting that engenders
trust (e.g. not in a place where adults can listen to what the
youth are saying)
Who (should be involved)?
• Staff, (young) volunteers and youth participants
• To maximise the relevance of the evaluation to the organisation,
they can and should be involved in the planning, undertaking,
analysis, and reporting. However, they will need support and/or
training
• Important stakeholders (administrators in schools and in the
community, health and social workers, religious leaders, donors, etc.)
• Not everyone needs to be involved in everything, but all should be
kept informed at crucial points, so that they can facilitate the undertaking of the
evaluation (permission to access information/ youth/ stakeholders;
statistical advice; etc.)
• External evaluator
• Evaluators lend credibility to results, but are expensive and need
follow up. Hiring an evaluator should be a conscious ‘investment’
decision on the part of an organisation that wants to undertake a
more complex evaluation (more for advocacy than for learning?)
Your decision will depend on
why you are evaluating!
• Because your donor told you to?
• Many decisions will have been taken for you.
• To improve your programme?
• An organisation wide reflection on which activities were
implemented, the feedback of participants and some indication of
impact in terms of risk and protective factors will be very useful.
• To advocate among donors and the community?
• Results of a self-evaluation (see above), including simple data, a
few interviews and focus group discussions, can go further
than you think!
• To show that your programme has a drug abuse prevention impact?
• Your programme might have run for long enough, with enough
coverage and intensity that you might think: yes, this is the time to
invest time and money to show that we are preventing drug abuse!
You will need a good external evaluator and possibly a control
group.