Monitoring and Evaluation of Knowledge Management for Development
Simon Hearn, ODI, firstname.lastname@example.org
Ewen Le Borgne, IRC International Water and Sanitation Centre, leborgne@irc.nl
Valerie Brown, Australian National University, email@example.com
“It is, in fact, nothing short of a miracle that the modern methods of instruction have not entirely strangled the holy curiosity of inquiry.”
- Albert Einstein
“When I use a word, it means just what I choose it to mean – neither more nor less.”
- Humpty Dumpty, in Lewis Carroll’s Through the Looking-Glass
Monitoring and evaluation
• OECD definitions:
– Evaluation: The systematic and objective assessment of an on-going or completed project, programme or policy, its design, implementation and results (abridged)
– Monitoring: A continuing function that uses systematic collection of data on specified indicators to provide indications of the extent of progress and achievement of objectives (abridged)
Monitoring and evaluation
• OK, but... any definition must recognise:
– M&E as universal functions, not specialised roles
– Presence of different worldviews
– Validity of evidence from different knowledge systems
– The ethical basis for the desired social change
– The importance of the unexpected and the emergent
• Objective and subjective
• Individual and society
• Facts and values
• Tacit and explicit
• E.g. the Western scientific conception of knowledge as ‘justified true belief’ vs the African concept of Ubuntu
• Development is often conceptualised as a service industry
• Yet delivery of even basic services (roads, sanitation...) requires an understanding of the social, political and economic contexts
• Thus, development is more like a knowledge industry (Powell 2006)
• But development is more than donor aid: we must also recognise civic-driven change
Challenges in M&E of KM4D
1. KM4D does not yet have a well-grounded theory
2. Knowledge-for-development practice is still young
3. KM4D goes beyond what is labelled ‘KM’
4. Competing ontological and epistemological perspectives (and related knowledge systems)
5. Existing reporting frameworks are designed for a service industry rather than a knowledge industry
6. There can be no simple cause-effect relationship
7. KM initiatives often lack explicit linkages to individual, specialist, organisational or social results
8. Knowledge is not static
9. Lack of methods for interpreting intangibles
A range of existing models:
1. The KM ripple model (Hulsebosch et al. 2009)
2. The KM Framework, based on Talisayon (2009): we need a better understanding of what builds relationship capital through intangibles
3. Nonaka and Takeuchi (1995): we need a better understanding of how knowledge is created
4. Graham et al. (2001): we need a better understanding of how knowledge is put to use
5. We need a better understanding of the organisational factors affecting knowledge use
6. We need to understand the level of...
Summary: a range of perspectives
• Ontological: What worldviews are reflected in the initiative and how do we recognise them?
• Epistemological: What are the knowledge domains contributing to M&E and how do they relate?
• Socio-political: Who has a stake in the monitoring process and who has power? How can we monitor these interdependent relationships?
• Methodological: How do we choose tools and approaches relevant to the parties and processes involved?
• Operational: How do we organise M&E activities according to each of the knowledge domains?
• Do you identify with these signposts?
• What signposts do you use?
• How do you see these models supporting your own M&E practice?
M&E as multiple partners
Multiple knowledges (Brown 2008):
• Individual: personal lived experience
• Community: shared community event
• Specialist: Environment, Health, Finance…
• Organisational: structure, aims
• Holistic: focus, vision
Collective knowledge as a nested set: a collaborative system
Port Pirie: a small town with the biggest lead smelter in the world
Knowledges and structural conditions:
• Individual: children diagnosed with lead poisoning
• Community: people long resigned to risk
• Specialist: Health Centre stays aloof
• Organisation: mine muzzles council
• Holistic focus: fear for future livelihood
New alliances in Port Pirie
• Individual: parent, grandparent
• Community: outrage, political action
• Specialist: technical skills, advocacy
• Organisational: public/private good
• Holistic: children’s well-being
M&E as collective learning
- multiple interests
- multiple knowledges
- collaborative action
- The IKM-E approach
- Emergent questions on the horizon
Our approach: Multi-evidence based?
Each knowledge community uses different M&E criteria, evidence bases and databases for its judgements:
• Individuals (experiences)
• Communities (observations)
• Experts (practitioner stories)
• Organisations (monitoring reports as stated)
• Holistic thinkers (ideas, forecasts)
Our approach: Purposes of conducting M&E
• Financial accountability
• Operational improvement
• Strategic readjustment
• Capacity strengthening
• Contextual understanding
• Deepening understanding (research)
(From I. Guijt’s PhD thesis, ‘Seeking Surprise’)
Our approach: KM as collective learning
Key to nested knowledge cultures
Our approach: critical questioning
• A series of questions at each step of the way
– Overall, a sound questioning practice
– And specifically, a guideline to tailor one’s own approach
• What questions to address?
• Who to involve, in what function?
• What tools and methods to choose?
• What lessons to draw from the approach?
Our approach: A nested iterative inquiry
Emergent questions on the horizon
• How would our approach work in practice?
• Specific methods and metrics to go ‘light’, particularly complexity-focused approaches
• Power vs. collective?
IKM-E + KMIC = IKMEKMIC?
• Avoiding overlaps...
– Connecting KMIC and IKM (blogs...)
– Organising another webinar?
– Identifying different models / approaches?
• Having creative leaps...
– Reviewing the IKM papers?
– Expanding parts of this paper?
– Testing the IKM-E framework (later)?
• IKM-Emergent website:
• The Giraffe, the Working Group 3 blog
• Working paper 3: ‘Monitoring and Evaluation in Knowledge Management for Development’
• Background paper: ‘Monitoring and evaluating knowledge management strategies’