Building Evaluation Capacity
Presentation Slides for Participatory Evaluation Essentials: A Guide for Nonprofit Organizations and Their Evaluation Partners

Anita M. Baker, Ed.D.
Bruner Foundation Rochester, New York

How to Use the Bruner Foundation Guide & PowerPoint Slides
Evaluation Essentials: A Guide for Nonprofit Organizations and Their Evaluation Partners (the Guide) and these slides are organized to help an evaluation trainee walk through the process of designing an evaluation and collecting and analyzing evaluation data. The Guide also provides information about writing an evaluation report. The slides allow for easy presentation of the content, and each section of the Guide includes activities that provide practice opportunities. The Guide has a detailed table of contents for each section and includes an evaluation bibliography. Also included are comprehensive appendices that can be pulled out and used for easy reference; they present brief treatments of special topics not covered in the main sections, sample logic models, completed interviews that can be used for training activities, and a sample observation protocol.

For the Bruner Foundation-sponsored REP project, we worked through all of this information up front, in a series of comprehensive training sessions. Each session included a short presentation of information, hands-on activities about the session topic, opportunities for discussion and questions, and homework for trainees to try on their own. By the end of the training sessions, trainees had developed their own evaluation designs, which they later implemented as part of REP. We then provided an additional 10 months of evaluation coaching and review while trainees conducted the evaluations they had designed, and we worked through several of the additional training topics presented in the appendix. At the end of their REP experience, trainees from nonprofit organizations summarized and presented the findings from the evaluations they had designed and conducted.
The REP nonprofit partners agreed that the up-front training helped prepare them to do solid evaluation work and provided opportunities for them to increase participation in evaluation within their organizations. The slides were first used in 2006-07 in a similar training project sponsored by the Hartford Foundation for Public Giving. We recommend the comprehensive approach for those who are interested in building evaluation capacity. Whether you are a trainee or a trainer, using the Guide to fully prepare for and conduct an evaluation or just to look up specific information about evaluation-related topics, we hope the materials provided here will support your efforts.

These materials are for the benefit of any 501(c)(3) organization. They MAY be used in whole or in part provided that credit is given to the Bruner Foundation.

They may NOT be sold or redistributed in whole or part for a profit. Copyright © by the Bruner Foundation 2007
* Please see the notes attached to the first slide for further information about how to use the available materials.


Building Evaluation Capacity Session 1
Important Definitions
Thinking About Evaluative Thinking
Anita M. Baker, Ed.D.
Bruner Foundation Rochester, New York


Working Definition of Program Evaluation
The practice of evaluation involves thoughtful, systematic collection and analysis of information about the activities, characteristics, and outcomes of programs, for use by specific people, to reduce uncertainties, improve effectiveness, and make decisions.

Working Definition of Participatory Evaluation
Participatory evaluation involves trained evaluation personnel and practice-based decision-makers working in partnership.

It brings together seasoned evaluators with seasoned program staff to:
• Address training needs
• Design, conduct and use results of program evaluation

Evaluation Strategy Clarification
All Evaluations Are:
• Partly social
• Partly political
• Partly technical

Both qualitative and quantitative data can be collected and used, and both are valuable. There are multiple ways to address most evaluation needs. Different evaluation needs call for different designs, types of data, and data collection strategies.

Purposes of Evaluation
Evaluations are conducted to:
• Render judgment
• Facilitate improvements
• Generate knowledge

Evaluation purpose must be specified at the earliest stages of evaluation planning and with input from multiple stakeholders.

What is an Evaluation Design?
An Evaluation Design communicates plans to evaluators, program officials and other stakeholders. Evaluation Designs help evaluators think about and structure evaluations.


Good Evaluation Designs Include the Following
• Summary information about the program
• The questions to be addressed by the evaluation
• The data collection strategies that will be used
• The individuals who will undertake the activities
• When the activities will be conducted
• The products of the evaluation (who will receive them and how they should be used)
• Projected costs to do the evaluation

Evaluation Questions Get You Started
• Focus and drive the evaluation.
• Should be carefully specified and agreed upon in advance of other evaluation work.
• Generally represent a critical subset of information that is desired.


What about Evaluation Stakeholders?
Evaluation stakeholders include anyone who makes decisions about a program, desires information about a program, and/or is involved directly with a program.
• Most programs have multiple stakeholders.
• Stakeholders have diverse, often competing interests.

Who are Evaluation Stakeholders?

• Organization officials
• Program staff
• Program clients or their caregivers
• Program funders


What do you need to know about a program before you design an evaluation?
1. What is/are the purpose(s) of the program?
2. What stage is the program in? (new, developing, mature, phasing out)
3. Who are the program clients?
4. Who are the key program staff (and, where applicable, in which department is the program)?
5. What specific strategies are used to deliver program services?
6. What outcomes are program participants expected to achieve?
7. Are there any other evaluation studies currently being conducted regarding this program?
8. Who are the funders of the program?
9. What is the total program budget?
10. Why has this program been selected for evaluation?

Thinking About Evaluative Thinking


What is Evaluative Thinking?

Evaluative Thinking is a type of reflective practice that incorporates use of systematically collected data to inform organizational decisions and other actions.


What Are Key Components of Evaluative Thinking?
1. Asking questions of substance
2. Determining data needed to address questions
3. Gathering appropriate data in systematic ways
4. Analyzing data and sharing results
5. Developing strategies to act on findings

How is Evaluative Thinking Related to Organizational Effectiveness?
• Organizational capacity areas (i.e., core skills and capabilities, such as leadership, management, finance and fundraising, programs and evaluation) where evaluative thinking is less evident are also capacity areas that usually need to be strengthened.
• Assessing evaluative thinking provides insight for organizational capacity enhancement.


Why Assess Evaluative Thinking?
Assessment of Evaluative Thinking . . .
• helps clarify what evaluative thinking is
• helps to identify organizational capacity areas where evaluative thinking is more or less prominent (or even non-existent)
• informs the setting of priorities regarding how to enhance or sustain evaluative thinking


What Organizational Capacity Areas Does the Bruner Foundation Evaluative Thinking Tool Address?
 Mission  Strategic Planning  Governance  Finance  Leadership  Fund Development  Evaluation  Client Relationships  Program Development  Communication & Marketing  Technology Acquisition & Training  Staff Development
 Human Resources  Alliances/Collaborations  Business Venture Dev.


How Can Evaluative Thinking be Assessed?
Develop or locate a tool.
Decide on an administrative approach and strategy:
– Individual vs. Team/Group
– Timing of administration
– Communicating about the assessment

Discuss how results could be used and plan for next steps.

The Bruner Foundation
Evaluative Thinking Assessment Tool
ORGANIZATION MISSION
(Each item is rated for Assessment and Priority.)
a. The mission statement is specific enough to provide a basis for developing goals and objectives.
b. The mission is reviewed and revised on a scheduled basis (e.g., annually) with input from key stakeholders as appropriate.
c. The organization regularly assesses compatibility between programs and mission.
d. The organization acts on the findings of compatibility assessments (in other words, if a program is not compatible with the mission, it is changed or discontinued).
Comments:

Please proceed to the next Worksheet


The Bruner Foundation
Evaluative Thinking Assessment Tool
GOVERNANCE
(Each item is rated for Assessment and Priority.)
a. Board goals/workplan/structure are based on the mission and strategic planning.
b. Board uses evaluation data in defining goals/workplan/structure and organizational strategic planning.
c. Board regularly evaluates progress relative to its own goals/workplan/structure.
d. There is a systematic process and timeline for identifying, recruiting, and electing new board members.
e. Specific expertise needs are identified and used to guide board member recruitment.
f. The board regularly (e.g., annually) evaluates the executive director’s performance based on established goals/workplan.
g. Board members assess and approve the personnel manual covering personnel policy.
h. The board assesses the organization’s progress relative to long-term financial plans.
i. The board assesses the organization’s progress relative to program evaluation results.
Comments:


The Bruner Foundation
Evaluative Thinking Assessment Tool
TECHNOLOGY ACQUISITION PLANNING AND TRAINING
(Each item is rated for Assessment and Priority.)
a. An assessment process is in place to make decisions about technology maintenance, upgrades, and acquisition.
b. Technology systems include software that can be used to manage and analyze evaluation data (e.g., Excel, SPSS).
c. Technology systems provide data to evaluate client outcomes.
d. Technology systems provide data to evaluate organizational management.
e. Technology systems are regularly assessed to see if they support evaluation.
f. Staff technology needs are regularly assessed.
Comments:


The Bruner Foundation
Evaluative Thinking Assessment Tool
Bruner Foundation Evaluative Thinking Assessment
Organizational Capacity Area               Capacity Score*   Action Planning** (Select from list)
 1. Mission                                      50          Action suggested (see priorities)
 2. Strategic Planning                           50          Action suggested (see priorities)
 3. Governance                                   63          No action required in this area
 4. Leadership                                   92          No action required in this area
 5. Finance                                      71          Action suggested (see priorities)
 6. Fund Development/Fund Raising                50          Action suggested (see priorities)
 7. Evaluation                                   69          Action required (see priorities)
 8. Program Development                          80          No action required in this area
 9. Client Relationships                         80          No action required in this area
10. Communication and Marketing                  80          No action required in this area
11. Technology Acquisition and Planning          67          Action suggested (see priorities)
12. Staff Development                            67          Action suggested (see priorities)
13. Human Resources                              33          Action required (see priorities)
14. Business Venture Development                 50          No action required in this area
15. Alliances and Collaboration                  40          No action required in this area
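A simple first cut at scores like those in the sample above is to rank the capacity areas from weakest to strongest. The sketch below is illustrative only: it uses the scores from the sample table, but note that the tool's own action flags are set from the priority ratings as well as the scores, so a score ranking is just a starting point for discussion.

```python
# Capacity scores (0-100) from the sample assessment summary above.
scores = {
    "Mission": 50, "Strategic Planning": 50, "Governance": 63,
    "Leadership": 92, "Finance": 71, "Fund Development/Fund Raising": 50,
    "Evaluation": 69, "Program Development": 80, "Client Relationships": 80,
    "Communication and Marketing": 80,
    "Technology Acquisition and Planning": 67, "Staff Development": 67,
    "Human Resources": 33, "Business Venture Development": 50,
    "Alliances and Collaboration": 40,
}

# Rank areas from lowest to highest score to surface candidate priorities.
ranked = sorted(scores, key=scores.get)
print("Weakest areas:", ranked[:2])
```

In the sample data this flags Human Resources (33) and Alliances and Collaboration (40), which matches the two "Action required"/low-score rows an evaluator would want to discuss first.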


The Bruner Foundation
Evaluative Thinking Assessment Tool
[Bar chart: “Evaluative Thinking Scores,” plotting each organizational capacity area’s score (0-100 scale) from the preceding summary table.]

How Should Evaluative Thinking Assessment Results be Used?
1. Review assessment results.
2. Distinguish communications vs. strategic issues (where possible).
3. Identify priorities and learn more about strategies to enhance Evaluative Thinking.
4. Develop an action plan based on priorities and what’s been learned about enhancing Evaluative Thinking.
5. Re-assess Evaluative Thinking and determine the effectiveness of the action plan.

Building Evaluation Capacity Session 2
Logic Models
Outcomes, Indicators and Targets

Anita M. Baker, Ed.D.
Bruner Foundation Rochester, New York

Logic Model Overview


So, what is a logic model anyway?
A Logic Model is a simple description of how a program is understood to work to achieve outcomes for participants.
It is a process that helps you to identify your vision, the rationale behind your program, and how your program will work.


And . . . .
Logic models are useful tools for program planning, evaluation and fund development.
Developing or summarizing a logic model is a good way to bring together a variety of people involved in program planning to build consensus on the program’s design and operations.


Why use a Logic Model?
• Developing a Logic Model will help you get clear about what you’re doing, and how you hope it will make a difference.
• You have the best knowledge of the context of your work and what’s important to you and your communities. Developing a Logic Model draws from what you already know.
• A Logic Model will leave you with a clear, thoughtful plan for what you are doing and what you hope to achieve. This plan can be an advocacy resource, bring clarity to your message and help you tell your story.

To Construct a Logic Model You Must Describe:
• Inputs: resources, money, staff/time, facilities, etc.
• Activities: how a program uses inputs to fulfill its mission – the specific strategies, service delivery.
• Outcomes: changes to individuals or populations during or after participation.

Inputs → Activities → Outcomes
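The three components above map naturally onto a simple data structure, which some teams find handy for keeping a logic model alongside other program records. This is an illustrative sketch only; the field names follow the slide, but the example entries are hypothetical, not from the Guide.

```python
# Illustrative one-dimensional logic model (inputs -> activities -> outcomes).
# The three component names come from the slide above; the entries below
# are hypothetical examples.
logic_model = {
    "inputs": ["grant funding", "2 staff members", "classroom space"],
    "activities": ["provide 6 weekly soft-skills classes",
                   "match participants with mentors"],
    "outcomes": ["participants gain marketable skills",
                 "participants maintain employment"],
}

def summarize(model):
    """Return a one-line summary in inputs -> activities -> outcomes order."""
    return " -> ".join(f"{name}: {len(items)}" for name, items in model.items())

print(summarize(logic_model))
```

Keeping the model in one place like this makes it easy to check, at a glance, that every activity is backed by an input and points toward at least one outcome.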


Here is an illustration that will help you create your own Logic Model.
Inputs: Resources dedicated to or consumed by the program.
E.g., money; staff and staff time; volunteers and volunteer time; facilities; equipment and supplies.

Contextual Analysis: Identify the major conditions and reasons for why you are doing the work in your community.

Activities: What the program does with the inputs to fulfill its mission.
E.g., provide x number of classes to x participants; provide weekly counseling sessions; educate the public about signs of child abuse by distributing educational materials to all agencies that serve families; identify 20 mentors to work with youth and opportunities for them to meet monthly for one year.

Outcomes: Benefits for participants during and after program activities.
E.g., new knowledge; increased skills; changed attitudes; modified behavior; improved condition; altered status.


Let’s analyze an example logic model
Contextual Analysis
People in my community:
• Have few job skills and are likely to have bad jobs or no jobs, and limited job histories.
• Have few opportunities for job training, placement, or help to deal with issues that come up while on the job.

Assumptions
• Jobs exist, we just have to help people find them. The absence of a job history perpetuates unemployment.
• Education can help people improve their skills. Being able to ask a mentor for advice is useful.
• Job seekers need help with soft skills and technical training.
• Personal, one-on-one attention and classes can inspire and support people in keeping jobs and establishing job histories.
• Getting solid hard and soft skills is the first step to keeping a job.
• If people feel supported, they will keep working.

Activities
• Provide 6 weekly Soft Skills classes.
• Identify on-the-job training opportunities and assist participants with placement.
• Conduct 6 months of on-the-job supervised training and lunchtime mentoring sessions.

Short-term Outcomes
• Participants learn specific marketable skills and strategies to help them get and keep jobs.
• Participants establish trusting relationships with mentors who can answer questions and support them while they are involved in on-the-job training.

Longer-term Outcomes
• Participants maintain their employment and establish records that increase the likelihood for continuous work and better jobs.

Ask yourself….
…do the outcomes seem reasonable given the program activities?
…do the assumptions resonate with me and my experiences?
…are there gaps in the strategy?

Summarizing a Logic Model Helps to:
• Create a snapshot of program operations that shows what is needed, how services are delivered and what is expected for participants.
• Describe programs currently or optimally.
• Identify key components to track.
• Think through the steps of participant progress and develop a realistic picture of what can be accomplished.

Important Things to Remember
Not all programs lend themselves easily to summarization in this format.

Logic models are best used in conjunction with other descriptive information or as part of a conversation.
It is advisable to have one or two key project officials summarize the logic model but then to have multiple stakeholders review it and agree upon what is included and how.


Important Things to Remember
• When used for program planning, it is advisable to start with outcomes and then determine what activities will be appropriate and what inputs are needed.
• There are several different approaches and formats for logic models. This one is one-dimensional and limited to three program features (inputs, activities, outcomes).
• The relationships between inputs, activities and outcomes are not one-to-one. The model is supposed to illustrate how the set of inputs could support the set of activities that contribute to the set of outcomes identified. (Levels of service delivery or “outputs” are shown in the activities.)

Use Logic Models for Planning, Evaluation and Fund Development
Contextual Analysis
• What is needed to address the context that exists?
• What would be interesting to try? What do we need to respond to this RFP?
• Identify the major conditions and reasons for why you are doing or could do this work.

Inputs: What resources do we need, can we dedicate, or do we currently use for this project?

Activities: What can or do we do with these inputs to fulfill the program mission?

Short-term Outcomes: What benefits for participants during and after the program can we or do we expect? New knowledge? Increased skills? Changed attitudes? Modified behavior? Improved condition? Altered status?

Assumptions: When do we think outcomes will happen? Will what happens initially affect or cause other longer-term outcomes?

Longer-term Outcomes: What do we think happens ultimately? How does or can this contribute to organizational and community value? How does this fit into our outcome desires overall?

Ask yourself….
…do the outcomes seem reasonable given the program activities?
…do the assumptions resonate with me and my experiences?
…are there gaps in the strategy?

Outcomes, Indicators and Targets


What is the difference between outcomes, indicators, and targets?
• Outcomes are changes in behavior, skills, knowledge, attitudes, condition or status.
• Outcomes are related to the core business of the program, are realistic and attainable, within the program’s sphere of influence, and appropriate.
• Outcomes are what a program is held accountable for.

What is the difference between outcomes, indicators, and targets?
• Indicators are specific characteristics or changes that represent achievement of an outcome.
• Indicators are directly related to the outcome and help define it.
• Indicators are measurable, observable, can be seen, heard or read, and make sense in relation to the outcome whose achievement they signal.

What is the difference between outcomes, indicators, and targets?
Targets specify the amount or level of outcome attainment that is expected, hoped for or required.


Why measure outcomes?
• To see if your program is really making a difference in the lives of your clients
• To confirm that your program is on the right track
• To be able to communicate to others what you’re doing and how it’s making a difference
• To get information that will help you improve your program

Use Caution When Identifying Outcomes
There is no right number of outcomes. Be sure to think about when to expect outcomes.
1) Initial Outcomes – first benefits/changes participants experience
2) Intermediate Outcomes – link initial outcomes to longer-term outcomes
3) Longer-term Outcomes – ultimate outcomes desired for program participants

Use Caution When Identifying Outcomes
• Outcomes should not go beyond the program’s purpose.
• Outcomes should not go beyond the scope of the target population.
• Avoid holding a program accountable for outcomes that are tracked and influenced largely by another system.
• Do not assume that all subpopulations will have similar outcomes.
• Consider carefully unintended and possibly negative outcomes.

Identifying Outcomes: Consider This . . .
• Is it reasonable to believe the program can influence the outcome in a non-trivial way?
• Would measurement of the outcome help identify program successes and help pinpoint and address problems or shortcomings?
• Will the program’s various “publics” accept this as a valid outcome of the program?
• Do program activities and outcomes relate to each other logically?

GET FEEDBACK

How do you identify indicators?
• Indicators are specific characteristics or changes that represent achievement of an outcome.
• Indicators are directly related to the outcome and help define it.
• Indicators are measurable, observable, can be seen, heard or read, and make sense in relation to the outcome whose achievement they signal.
• Ask the questions shown on the following slide.


Questions to Ask When Identifying Indicators
1. What does this outcome look like when it occurs?
2. What would tell us it has happened?
3. What could we count, measure or weigh?
4. Can you observe it?
5. Does it tell you whether the outcome has been achieved?

Let’s “break it down”
Use the “I’ll know it when I see it” rule
The BIG question is: what evidence do we need to see to be convinced that things are changing or improving? The “I’ll know it (outcome) when I see it (indicator)” rule in action, with some examples:

I’ll know that retention has increased among home health aides involved in a career ladder program
when I see a reduction in the employee turnover rate among aides involved in the program
and when I see survey results that indicate that aides are experiencing increased job satisfaction

“I’ll know it when I see it”
I’ll know that economic stability has increased among the clients I place in permanent employment
when I see an increase in the length of time that clients keep their jobs
and when I see an increase in the number of clients who qualify for jobs with benefits

I’ll know my clients are managing their nutrition and care more effectively
when I see my clients consistently show up for scheduled medical appointments
and when I see decreases in my clients’ body mass indexes (BMI)
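Indicators like the turnover rate in the first example reduce to small calculations once records are pulled. The sketch below is illustrative: the staffing numbers are hypothetical, and the two-period comparison is our framing, not a method prescribed by the Guide.

```python
# Hypothetical staffing records for the career-ladder example above:
# separations during the year, out of the number of aides employed.
before = {"separations": 30, "aides": 100}   # year before the program
after_ = {"separations": 18, "aides": 100}   # first program year

def turnover_rate(period):
    """Annual turnover: share of aides who left during the period."""
    return period["separations"] / period["aides"]

reduction = turnover_rate(before) - turnover_rate(after_)
print(f"Turnover fell by {reduction:.0%}")
```

A drop like this would be the "when I see it" evidence for the retention outcome; the survey-based job-satisfaction indicator would be checked separately.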

Remember! When Identifying Indicators . . .
• Indicators must be observable and measurable.
• Indicators may not capture all aspects of an outcome.
• Many outcomes have more than one indicator. Identify the set that you believe (or have agreed) adequately and accurately signals achievement of an outcome.

Examples of Indicators

Outcome (Initial): Teens are knowledgeable of prenatal nutrition and health guidelines
Indicator: Program participants are able to identify food items that are good sources of major dietary requirements

Outcome (Intermediate): Teens follow proper nutrition and health guidelines
Indicators: Participants are within proper ranges for prenatal weight gain; participants abstain from smoking; participants take prenatal vitamins

Outcome (Intermediate): Teens deliver healthy babies
Indicator: Newborns weigh at least 5.5 pounds and score 7 or above on the APGAR scale

What are Targets?

Targets specify the amount or level of outcome attainment that is expected, hoped for or required.


How do you Identify Targets?
Targets or levels of outcome attainment can be determined relative to:
• External standards (when they are available)
• Internal agreement
  – best professional hunches
  – past performance
  – performance of similar programs

Example of a Target
Outcome: Parents will read to their preschoolers more often.
Indicator: Parent reports of increased reading time after coming to the program.
Target: 75% of participating parents will report a 50 percent increase in how often they read to their preschoolers.
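Checking whether a target like this was met is a small calculation once the indicator data are collected. In the sketch below, only the 75% and 50% figures come from the example above; the survey responses (each value is one parent's reported percent increase in reading frequency) are made up.

```python
# Hypothetical survey responses: each value is one parent's reported
# percent increase in how often they read to their preschooler.
reported_increases = [60, 75, 10, 120, 50, 0, 80, 55]

TARGET_SHARE = 0.75       # target: 75% of participating parents ...
REQUIRED_INCREASE = 50    # ... report at least a 50 percent increase

met_indicator = [r for r in reported_increases if r >= REQUIRED_INCREASE]
share = len(met_indicator) / len(reported_increases)

print(f"{share:.0%} of parents reported a {REQUIRED_INCREASE}%+ increase")
print("Target met" if share >= TARGET_SHARE else "Target not met")
```

Stating the comparison rule this explicitly (share of participants vs. a pre-agreed threshold) is exactly the buy-in the caution slide below asks for.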

Example of a Target
Outcome: Parents will read to their preschoolers more often.
Indicator: Parent reports of increased reading time after coming to the program.
Target: 75% of participating parents will report reading to their preschoolers for at least 15 minutes, 4 or more nights per week.

Targets: Some Words of Caution
• Performance targets should be specified in advance (i.e., when deciding to measure outcomes).
• Be sure there is buy-in regarding what constitutes a positive outcome – when the program has achieved the target and when it has missed the mark.
• Lacking data on past performance, it may be advisable to wait.
• Be especially cautious about wording numerical targets so they are not over- or under-ambitious.
• Be sure target statements are in sync with meaningful program time frames.

Building Evaluation Capacity Session 3
Evaluation Questions and Designs Documenting Service Delivery Enhancing Service Delivery

Anita M. Baker, Ed.D.
Bruner Foundation Rochester, New York

To Construct a Logic Model You Must Describe:
• Inputs: resources, money, staff/time, facilities, etc.
• Activities: how a program uses inputs to fulfill its mission – the specific strategies, service delivery.
• Outcomes: changes to individuals or populations during or after participation. It's easiest to embed targets here.
• Indicators: specific characteristics or changes that represent achievement of an outcome.
• Targets: the amount or level of outcome attainment that is expected, hoped for or required.

Inputs → Activities → Outcomes → Indicators w/ Targets → Data Sources


To Construct a Logic Model You Must Describe:
• Inputs: resources, money, staff/time, facilities, etc.
• Activities: how a program uses inputs to fulfill its mission – the specific strategies, service delivery.
• Outcomes: changes to individuals or populations during or after participation. It's easiest to embed targets here.
• Indicators: specific characteristics or changes that represent achievement of an outcome.
• Targets: the amount or level of outcome attainment that is expected, hoped for or required.

Inputs → Activities → Outcomes → Indicators → Data Sources

Data sources include: reports (staff, clients, sig. others); existing records (staff, clients); observation (staff, clients); test results (staff, clients).

Evaluation Strategy Clarification
All Evaluations Are:
• Partly social
• Partly political
• Partly technical

Both qualitative and quantitative data can be collected and used, and both are valuable. There are multiple ways to address most evaluation needs.


What is an Evaluation Design?
An Evaluation Design communicates plans to evaluators, program officials and other stakeholders. Evaluation Designs help evaluators think about and structure evaluations.


Good Evaluation Designs Include the Following (see Appendix 6)
• Summary information about the program
• The questions to be addressed by the evaluation
• The data collection strategies that will be used
• The individuals who will undertake the activities
• When the activities will be conducted
• The products of the evaluation (who will receive them and how they should be used)
• Projected costs to do the evaluation

Evaluation Questions . . .
• Focus and drive the evaluation.
• Should be carefully specified and agreed upon in advance of other evaluation work.
• Generally represent a critical subset of information that is desired.

Evaluation Questions: Criteria
• It is possible to obtain data to address the questions.
• There is more than one possible "answer" to the question.
• The information to address the questions is wanted and needed.
• It is known how resulting information will be used internally (and externally).
• The questions are aimed at changeable aspects of programmatic activity.


Evaluation Questions: Advice
• Limit the number of evaluation questions; between two and five is optimal.
• Keep it manageable.


Evaluation Questions: Examples
Eval. Question: How was staff training delivered, how did participants respond, and how have they used what they learned?

Data Collection/Protocol Questions:
1. How would you rate the staff training you received?
2. Did the staff training you received this year meet your needs?
3. Has the training you received changed your practice?
4. Has the training you received led to changes in . . .


Evaluation Questions: Examples
Eval. Question: How was staff training delivered, how did participants respond, and how have they used what they learned?
1. How would you rate the staff training you received?
2. Did the staff training you received this year meet your needs?
3. Has the training you received changed your practice?
4. Has the training you received led to changes in . . .

Eval. Question: How and to what extent has the program met its implementation goals?
1. What does the X program do best? What is your greatest concern?
2. Do staff communicate with caretakers as often as required?
3. Did you receive all the services promised in the program brochure?
4. How knowledgeable are staff about the issues you face?

Eval. Question: What impact has the program had on participants?
1. Have you changed the way you proceed with planning requirements?
2. Do you know more about guardianship now than before the program?
3. How would you rate this program overall?


Switching Gears

How are Evaluative Thinking and Service Delivery (Activities) related?


When Organizations use Evaluative Thinking . . .

• Client interaction includes collection and use of information. • Service delivery and program development include collection and use of information.


Examples of Evaluative Thinking: Client Interaction
• Client needs assessments are conducted regularly. • Program services reflect client needs. • Client satisfaction and program outcomes are regularly assessed.
• Results of client outcome assessments and client satisfaction are used.

Evaluative Thinking Pitfalls: Client Data

1) Designing programs based on what funders want or only on what is “thought” to be best.
2) Assuming client happiness = program effectiveness.

3) Collecting but not analyzing client data. 4) Limiting data collection from clients to satisfaction only.

Examples of Evaluative Thinking: Program Development
• Identifying gaps in community services before planning new programs. • Assessing the needs of the target population as part of program planning process. • Using data from needs assessments and/or gaps analyses to inform planning.

Organizations that Regularly use Evaluative Thinking Will Also . . .
• Think carefully about developing and assessing programs.

• Incorporate program evaluation findings into program planning.
• Involve significant others in planning/revising.
• Develop written program plans and logic models.
• Follow program plans.
• Have strategies in place to modify plans.

What Can Organizational Leaders do to Enhance Evaluative Thinking?
• Educate staff about Evaluative Thinking. • Be clear about what it means to take an evaluative approach.

• Set the stage for others by using Evaluative Thinking in your own practice.

Remember Logic Models???
 Inputs: resources, money, staff/time, facilities, etc.

 Activities: how a program uses inputs to fulfill its mission – the specific strategies, service delivery.
 Outcomes: changes to individuals or populations during or after participation. It’s easiest to embed targets here

 Indicators: Indicators are specific characteristics or changes that represent achievement of an outcome
Inputs Activities Outcomes Indicators


What strategies are used to collect data about indicators?
 Surveys

 Interviews
 Observations

 Record/Document Reviews


Evaluation Data Collection

 Surveys

 Interviews
 Observations

 Record/Document Reviews


Surveys:
• Have a series of questions (items) with pre-determined response choices.
• Can include all independent items or groups of items (scales) that can be summarized.
• Can also include some open-ended items for write-in or clarification.
• Can be completed by respondents or survey administrators.
• Can be conducted via mail, with a captive audience, on the phone or using the internet, and through a variety of alternative strategies.

Instruments are called surveys, questionnaires, or assessment forms.

Use Surveys:
• To study attitudes and perceptions.
• To collect self-reported assessment of changes in response to program.
• To collect program assessments.
• To collect some behavioral reports.
• To test knowledge.
• To determine changes over time.

Best with big or distant groups, and for sensitive information.

Evaluation Data Collection

 Surveys

 Interviews
 Observations

 Record/Document Reviews


Interviews
• An interview is a one-sided conversation between an interviewer and a respondent.
• Questions are (mostly) pre-determined, but open-ended. Can be structured or semi-structured.
• Respondents are expected to answer using their own terms.
• Interviews can be conducted in person, via phone, one-on-one or in groups. Focus groups are specialized group interviews.
Instruments are called protocols, interview schedules or guides

Use Interviews:
• To study attitudes and perceptions using respondent's own language.
• To collect self-reported assessment of changes in response to program.
• To collect program assessments.
• To document program implementation.
• To determine changes over time.


Evaluation Data Collection

 Surveys

 Interviews
 Observations

 Record/Document Reviews


Observations
Observations are conducted to view and hear actual program activities so that they can be described thoroughly and carefully.
• Observations can be focused on programs overall or participants in programs.
• Users of observation reports will know what has occurred and how it has occurred.
• Observation data are collected in the field, where the action is, as it happens.

Instruments are called protocols, guides, sometimes checklists.

Use Observations:
• To document program implementation.
• To witness levels of skill/ability, program practices, behaviors.
• To determine changes over time.


Evaluation Data Collection

 Surveys

 Interviews
 Observations

 Record/Document Reviews


Record Review
Review of program records involves accessing existing internal information or information that was collected for other purposes. Data are obtained from:
• a program's own records (e.g., intake forms, program attendance);
• records used by other agencies (e.g., report cards, drug screening results, hospital birth data);
• adding questions to standard record-keeping strategies (e.g., a question for parents about program value can be added to an enrollment form).

Instruments are called protocols. Use requires identification of and access to available information.

Building Evaluation Capacity Session 4
Evaluation Data Collection & Analysis Surveys and Interviews

Anita M. Baker, Ed.D.
Bruner Foundation Rochester, New York

Evaluative Thinking
• Ask important questions before decisions are made,
• Systematically collect and analyze data to inform decisions,
• Share results of findings, and
• Base responses and actions on the results of analyses (as appropriate).

Evaluation Data Collection

 Surveys

 Interviews
 Observations

 Record/Document Reviews


Surveys:
• Have a series of questions (items) with pre-determined response choices.
• Can include all independent items or groups of items (scales) that can be summarized.
• Can also include some open-ended items for write-in or clarification.
• Can be completed by respondents or survey administrators.
• Can be conducted via mail, with a captive audience, on the phone or using the internet, and through a variety of alternative strategies.

Instruments are called surveys, questionnaires, or assessment forms.

Surveys Are Most Productive When They Are:
• Well targeted, with a narrow set of questions.
• Used to obtain data that are otherwise hard to get.
• Used in conjunction with other strategies.

Surveys are best used with large numbers, for sensitive information, and for groups that are hard to collect data from. Most survey data are categorical (qualitative), but simple quantitative analyses are often used to summarize responses.

Surveys can be administered and analyzed quickly when . . .
• pre-validated instruments are used
• sampling is simple or not required
• the topic is narrowly focused
• the number of questions (and respondents) is relatively small
• the need for disaggregation is limited

Use Surveys To . . .
• study attitudes and perceptions.
• collect self-reported assessment of changes in response to program.
• collect program assessments.
• collect some behavioral reports.
• test knowledge.
• determine changes over time.


Benefits of Surveys
• Surveys can be used for a variety of reasons, such as exploring ideas or getting sensitive information.
• Surveys can provide information about a large number and wide variety of participants.
• Survey analysis can be simple. Computers are not required.
• Results are compelling, have broad appeal and are easy to present.

Drawbacks of Surveys:
• Designing surveys is complicated and time consuming.
• The intervention effect can lead to false responses, or it can be overlooked.
• Broad questions and open-ended responses are difficult to use.
• Analyses and presentations can require a great deal of work. You MUST be selective.

Developing/Assessing Survey Instruments
1) Identify key issues.
2) Review available literature.
3) Convert key issues into questions.
4) Determine what other data are needed.
5) Determine how questions will be ordered and formatted.
6) Have the survey instrument reviewed.


For Survey Items, Remember:
1) State questions in specific terms; use appropriate language.
2) Use multiple questions to sufficiently cover a topic.
3) Avoid "double-negatives."
4) Avoid asking multiple questions in one item.
5) Be sure response categories match the question, are exhaustive, and don't overlap.
6) Be sure to include directions and check numbering, format, etc.


Types of Surveys:
Mail Surveys: must have correct addresses and return instructions; must conduct tracking and follow-up. Response is typically low.
Electronic Surveys: must be sure respondents have access to the internet; must have a host site that is recognizable or used by respondents; must have current email addresses. Response is often better.
Web+ (combining mail and e-surveys): data input required; analysis is harder.


Types of Surveys:
Phone Surveys: labor intensive; require trained survey administrators, access to phone numbers, and usually CATI software. Response is generally better than mail, but refusal rules must be established.
Staged Surveys: trained survey administrators required; caution must be used when collecting sensitive information. Can be administered orally; multiple response options possible; response rates very high.
Intercept Surveys: require trained administrators. Refusal is high.


Sampling
Surveys are not always administered to every member of a group (population). Often, some members – a sample – are selected to respond. (Additional strategies in manual.)

Convenience Samples:
• Provide useful information to estimate outcomes (e.g., 85% of respondents indicated the program had definitely helped them).
• Must be used cautiously; generalization is limited.

Random Samples: everyone must have an equal opportunity to be selected.
• Careful administration and aggressive follow-up needed.
• Generalization/prediction possible.
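Drawing a simple random sample, where every member has an equal chance of selection, is a single standard-library call. A minimal sketch with a hypothetical client roster:

```python
import random

roster = [f"client_{i:03d}" for i in range(1, 480)]  # population of 479 clients

random.seed(7)                      # fixed seed so the draw is reproducible
sample = random.sample(roster, 80)  # each client has an equal chance of selection

print(len(sample), "drawn;", len(set(sample)), "distinct")
```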

How Many Surveys Do you Need to Administer?
• Identify the population size, desired confidence, and sampling error thresholds. 95% confidence with 5% error is common: with the right sample size you can be 95% confident that the answer given by respondents is within 5 percentage points of the answer if all members of the population had responded.
• Use this formula: n = 385 / (1 + (385 / all possible respondents)), OR
• Consult a probability table (see manual).
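The slide's formula wraps into a few lines of code. A sketch (the constant 385 corresponds to 95% confidence with 5% error, as stated above; the populations tried are arbitrary examples):

```python
import math

def sample_size(population, base=385):
    """n = 385 / (1 + 385/population), rounded up to whole respondents."""
    return math.ceil(base / (1 + base / population))

for n in (500, 1000, 10000):
    print(n, "->", sample_size(n))
```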


How Many Surveys Do you Need to Administer?
The sample should be as large as probabilistically required (probability – not percentage).

• If a population is smaller than 100, include them all.
• When a sample is comparatively large, adding cases does not increase precision.
• When the population size is small, relatively large proportions are required, and vice versa.

You must always draw a larger sample than needed to accommodate refusal: desired sample size ÷ (1 − refusal proportion).
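The refusal adjustment is a one-line calculation. A sketch (the 30% refusal rate is a hypothetical figure):

```python
import math

def draw_size(desired, refusal_rate):
    """Inflate the sample to cover expected refusals: desired / (1 - refusal)."""
    return math.ceil(desired / (1 - refusal_rate))

# 278 completed surveys needed, with roughly 30% of those contacted refusing:
print(draw_size(278, 0.30))
```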


How Can I Increase Response?
Write a good survey and tailor administration to respondents.
• Advertise survey purpose and administration details in advance.
• Carefully document who receives and completes surveys.
• Aggressively follow up. Send reminders.
• Consider using incentives. Make response easy.

Remember: Non-response bias can severely limit your ability to interpret and use survey data.


Calculating Response Rates
Response rate is calculated by dividing the number of returned surveys by the total number of “viable” surveys administered. Desirable response rates should be determined in advance of analysis and efforts should be made to maximize response.
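As a quick sketch of the calculation (the counts are hypothetical; undeliverable surveys are treated as non-viable):

```python
def response_rate(returned, administered, non_viable=0):
    """Returned surveys divided by 'viable' surveys administered, as a percent."""
    return 100 * returned / (administered - non_viable)

# 240 returns out of 320 administered, 20 of which were undeliverable:
print(round(response_rate(240, 320, non_viable=20), 1))
```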


Administration Plans

(see Appendix 12)

Before you administer a survey be sure you can answer the following questions!
• Who and where are your target groups? Do they require assistance to answer?
• Which type of survey will be best to use with your target group? How often?
• Will the survey be anonymous or confidential?
• How much time will be required to respond?
• How will you analyze the data you expect to collect?

Administration Plans (Con’t.)
• What specific fielding strategy will be used? Will there be incentives?
• How will you track the surveys?
• How will you provide ample opportunities for all members of the sample to respond? What response rate is desired?
• Whose consent is required/desired? Will you use active or passive consent?
• How will you store and maintain the confidentiality of the information?


Preparing for Analysis: Developing Codebooks
 Unless they’re embedded, assign numbers for all response categories.  Write the codes onto a copy of the survey and use for reference.  It is bad practice to re-code data as you go. Prepare for entry as is.  List or describe how data are to be recoded.

What Should Your Survey Analysis Plan Include?
• How survey items are related to the evaluation overall
• What analytical procedures, including disaggregation, will be conducted with each kind of data you collect
• How you will present results
• How you will decide whether data show that targets have been exceeded, met or missed (as appropriate)
• How you will handle missing data

Example: Student Survey Analysis Plan
1. The percentages of all students who smoke and those who have recently started will be calculated. 2. The percentage of boys who smoke will be compared to the percentage of girls who smoke.
3. The average age of first alcohol use will be calculated from students’ responses.

Example: Continued Student Survey Analysis Plan
4. The percentage of students who provide positive (i.e., Good or Very Good) ratings for the smoking prevention program will be calculated. Answers for self-reported nonsmokers will be compared to self-reported smokers.
5. The distribution of scores on the likelihood-of-addiction scale will be determined.

• Only valid percents will be used; items missed by more than 10% of respondents will not be used.
• Meeting the target means ±5 percentage points. Far exceeding or missing = ±15 percentage points.
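Plan steps like these translate directly into code. A sketch with hypothetical student records, showing the boy/girl comparison and a valid-percent calculation that excludes skipped items:

```python
# Hypothetical records: (gender, smokes); None means the item was skipped.
records = [("girl", True), ("girl", False), ("boy", True), ("boy", True),
           ("girl", False), ("boy", False), ("girl", None), ("boy", False)]

def valid_pct(rows):
    """Percent True among non-missing answers (the plan's 'valid percent')."""
    answered = [smokes for _, smokes in rows if smokes is not None]
    return 100 * sum(answered) / len(answered)

boys = valid_pct([r for r in records if r[0] == "boy"])
girls = valid_pct([r for r in records if r[0] == "girl"])
print(round(boys), round(girls))
```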

Common Factors That Can Influence Responses
Participant Characteristics
• Age group, gender, race/ethnicity
• Educational level or type
• Household income group, household composition
• Status (e.g., disabled/not, smoker/non)
• Degree of difficulty of the participant's situation

Location of Program
• Political or geographic boundaries
• Program sites
• Characteristics of location (e.g., distressed/not)

Program Experience (type, amount, or history)

Where appropriate, disaggregate by one or more of these!

Survey Result Example
Disaggregated Data

                                                Peer Study Group
% of 2005 Freshmen who . . .               Yes (n=232)   No (n=247)   Total (N=479)
Reported struggling to maintain grades         36%          58%           47%
Are planning to enroll for the
sophomore year at this school                  89%          72%           80%

Survey Result Example
Comparison of Site Outcomes: MS/JHS Only
[Bar chart comparing CAP Site 1, CAP Site 2 and CAP Site 3 on a 0–100% scale across several middle/junior high school outcomes; the individual bar labels did not survive extraction.]

Survey Result Example
Table 3: Relationships between ATOD use and other factors among 9th graders

                                                Ever Used ATOD   Never Used ATOD
Academically and socially attached to school          35%              68%
Have post-secondary aspirations                       49%              73%
Are passing most classes                              86%              94%
Were sent to the office during last 2 months          23%              10%
Describe their health as excellent                    30%              42%
Felt unhappy, sad or depressed recently               32%              12%


Evaluation Data Collection

 Surveys

 Interviews
 Observations

 Record/Document Reviews


Interviews
• An interview is a one-sided conversation between an interviewer and a respondent.
• Questions are (mostly) pre-determined, but open-ended. Can be structured or semi-structured.
• Respondents are expected to answer using their own terms.
• Interviews can be conducted in person, via phone, one-on-one or in groups. Focus groups are specialized group interviews.
Instruments are called protocols, interview schedules or guides

Use Interviews:
• To study attitudes and perceptions using respondent's own language.
• To collect self-reported assessment of changes in response to program.
• To collect program assessments.
• To document program implementation.
• To determine changes over time.

Interviews: Methodological Decisions
• What type of interview should you conduct? (see pg. 28) Unstructured, semi-structured, structured, or intercept.
• What should you ask? How will you word and sequence the questions?
• What time frame will you use (past, present, future, mixed)?

Interviews: More About Methodological Decisions
• How much detail and how long to conduct?
• Who are respondents? (Is translation necessary?)
• How many interviews, on what schedule?
• Will the interviews be conducted in person, by phone, on- or off-site?
• Are group interviews possible/useful?

Conducting and Recording Interviews: Before
• Clarify purpose for the interview.
• Specify answers to the methodological decisions.
• Select potential respondents – sampling.
• Collect background information about respondents.
• Develop a specific protocol to guide your interview.


Conducting and Recording Interviews: During
• Use the protocol (device) to record responses.
• Use probes and follow-up questions as necessary for depth and detail.
• Ask singular questions.
• Ask clear and truly open-ended questions.


Conducting and Recording Interviews: After
• Review interview responses, clarify notes, decide about transcription.
• Record observations about the interview.
• Evaluate how it went and determine follow-up needs.
• Identify and summarize some key findings.


Tips for Effective Interviewing
• Communicate clearly about what information is desired, why it's important, and what will happen to it.
• Remember to ask single questions and use clear and appropriate language. Avoid leading questions.
• Check (or summarize) occasionally. Let the respondent know how the interview is going, how much longer, etc.
• Understand the difference between a depth interview and an interrogation. Observe while interviewing.
• Practice interviewing – develop your skills!

More Tips
• Recognize when the respondent is not clearly answering and press for a full response.
• Maintain control of the interview and neutrality toward the content of responses.
• Treat the respondent with respect. (Don't share your opinions or knowledge. Don't interrupt unless the interview is out of hand.)
• Practice interviewing – develop your skills!


Analyzing Interview Data
1) Read/review completed sets of interviews.
2) Record general summaries.
3) Where appropriate, encode responses.
4) Summarize coded data.
5) Pull quotes to illustrate findings (see pg. 30 for examples).
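Steps 3 and 4 can be as simple as tallying theme codes across respondents. A sketch with hypothetical codes assigned during review:

```python
from collections import Counter

# Hypothetical theme codes assigned to each respondent's answers (step 3).
coded = {
    "R1": ["access", "staff_support"],
    "R2": ["staff_support"],
    "R3": ["access", "cost"],
    "R4": ["staff_support", "cost"],
}

# Summarize the coded data (step 4): how often each theme came up.
tally = Counter(code for codes in coded.values() for code in codes)
print(tally.most_common())
```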


What Happens After Data are Collected?
1. Data are analyzed and results are summarized.
2. Findings must be converted into a format that can be shared with others.
3. Action steps should be developed from findings.

Step 3 moves evaluation from perfunctory compliance into the realm of usefulness: "Now that we know _____ we will do _____."

Increasing Rigor in Program Evaluation
• Mixed methodologies
• Multiple sources of data
• Multiple points in time


Building Evaluation Capacity Session 5
Evaluation Data Collection & Analysis Observation and Record Review

Anita M. Baker, Ed.D.
Bruner Foundation Rochester, New York

What strategies are used to collect data about indicators?
 Surveys

 Interviews
 Observations

 Record/Document Reviews


Evaluation Data Collection

 Surveys

 Interviews
 Observations

 Record/Document Reviews


Observations
Observations are conducted to view and hear actual program activities so that they can be described thoroughly and carefully.
• Observations can be focused on programs overall or participants in programs.
• Users of observation reports will know what has occurred and how it has occurred.
• Observation data are collected in the field, where the action is, as it happens.

Instruments are called protocols, guides, sometimes checklists.

Use Observations:
• To document program implementation.
• To witness levels of skill/ability, program practices, behaviors.
• To determine changes over time.


Trained Observers Can:
• see things that may escape the awareness of others
• learn about things that others may be unwilling or unable to talk about
• move beyond the selective perceptions of others
• present multiple perspectives


Other Advantages
• The observer's knowledge and direct experience can be used as resources to aid in assessment.
• Feelings of the observer become part of the observation data.
• OBSERVER'S REACTIONS are data, but they MUST BE KEPT SEPARATE.


Observations: Methodological Decisions
• What should be observed and how will you structure your protocol? (individual, event, setting, practice)
• How will you choose what to see?
• Will you ask for a "performance" or just attend a regular session, or both? Strive for "typical-ness."

Observations: Methodological Decisions
• Will your presence be known, or unannounced? Who should know?
• How much will you disclose about the purpose of your observation?
• How much detail will you seek? (checklist vs. comprehensive)
• How long and how often will the observations be?


Conducting and Recording Observations: Before
• Clarify the purpose for conducting the observation.
• Specify the methodological decisions you have made.
• Collect background information about the subject (if possible/necessary).
• Develop a specific protocol to guide your observation.


Conducting and Recording Observations: During
• Use the protocol to guide your observation and record observation data.
• BE DESCRIPTIVE (keep observer impressions separate from descriptions of actual events).
• Inquire about the "typical-ness" of the session/event.


Conducting and Recording Observations: After
• Review observation notes and make clarifications where necessary: clarify abbreviations, elaborate on details, transcribe if feasible or appropriate.
• Evaluate the results of the observation. Record whether the session went well, the focus was covered, there were any barriers to observation, and whether there is a need for follow-up.
Observation Protocols
 Comprehensive
   Setting
   Beginning, ending, and chronology of events
   Interactions
   Decisions
   Nonverbal behaviors
   Program activities and participant behaviors; response of participants

 Checklist – “best” or expected practices
Analyzing Observation Data
 Make summary statements about trends in your observations

  Example: “Every time we visited the program, the majority of the children were involved in a literacy development activity such as reading, illustrating a story they had read or written, or practicing reading aloud.”

 Include “snippets” or excerpts from field notes to illustrate summary points (see manual pp. 38-39)
Analyzed Observation Data
Many different types of arts activities were undertaken, and personal development was either delivered directly or integrated with arts activities. Of the 57 different combinations of programming at the 10 sites, only 3 included activities that were not wholly successful with their target groups, 2 of those because of a mismatch between the instructor and the participant group. At all sites, ongoing projects were underway and examples of participant work were readily visible. Teaching artists were demonstrating skills, giving youth opportunities to try the skills, and providing one-on-one assistance as needed.
Evaluation Data Collection

 Surveys
 Interviews
 Observations
 Record/Document Reviews

Record Review
Review of program records involves accessing existing internal information or information that was collected for other purposes. Data are obtained from:
 a program’s own records (e.g., intake forms, program attendance)
 records used by other agencies (e.g., report cards, drug screening results, hospital birth data)
 adding questions to standard record-keeping strategies (e.g., a question for parents about program value can be added to an enrollment form)
Instruments are called protocols. Use requires identification of and access to available information.
Use Record Reviews:

 To collect some behavioral reports.
 To test knowledge.
 To verify self-reported data.
 To determine changes over time.
Analyzing/Using Record Review Data
 Findings from record review data are usually determined through secondary analysis.

Example: Attendance data are regularly collected for a program to inform routine program operations. Attendance records are summarized quarterly or annually to inform other stakeholders such as funders about program use.
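As a minimal sketch of this kind of secondary analysis (the dates and field layout below are hypothetical, not from the Guide), attendance records kept for day-to-day operations can be rolled up by quarter for reporting to funders:

```python
from collections import Counter
from datetime import date

# Hypothetical attendance log: one record per participant visit.
visits = [
    date(2007, 1, 9), date(2007, 2, 14), date(2007, 3, 2),
    date(2007, 4, 11), date(2007, 4, 25), date(2007, 7, 16),
]

def quarter(d):
    """Label a date with its calendar quarter, e.g. '2007-Q1'."""
    return f"{d.year}-Q{(d.month - 1) // 3 + 1}"

# Count visits per quarter for the funder summary.
visits_per_quarter = Counter(quarter(d) for d in visits)
# visits_per_quarter -> Counter({'2007-Q1': 3, '2007-Q2': 2, '2007-Q3': 1})
```

The same tally could be produced in a spreadsheet; the point is that data collected for operations are reused, not re-collected.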
Analyzing/Using Record Review Data
 Results of record reviews are typically arrayed in tables or summarized in profiles or “bullet lists” as frequencies, proportions, or averages (see pg. 16, appendix 10 in the Participatory Evaluation Essentials Guide).

 Like observation data, record review data can be descriptive and/or evaluative (see pg. 16).

Analyzing/Using Record Review Data

 Record review data are commonly combined:
  for multi-variate analyses
  with other evaluation data to determine relationships

Collecting Record Review Data
 Review existing data collection forms (suggest modifications or use of new forms if possible).
 Develop a code book, or at least a data element list keyed to data collection forms.
 Develop a “database” for record review data.
 Develop an analysis plan with mock tables for record review data.
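One lightweight way to keep a code book keyed to the data collection forms is a simple mapping from each data element to its type and valid values. The field names and rules below are hypothetical, for illustration only:

```python
# Hypothetical code book: each data element with its type and a validity rule.
CODE_BOOK = {
    "participant_id":  {"type": int, "valid": lambda v: v > 0},
    "enrollment_date": {"type": str, "valid": lambda v: len(v) == 10},  # YYYY-MM-DD
    "attendance_days": {"type": int, "valid": lambda v: 0 <= v <= 365},
}

def validate(record):
    """Return the names of fields that violate the code book."""
    problems = []
    for field, spec in CODE_BOOK.items():
        value = record.get(field)
        if not isinstance(value, spec["type"]) or not spec["valid"](value):
            problems.append(field)
    return problems

# A record with an out-of-range attendance value fails validation.
flagged = validate({"participant_id": 17, "enrollment_date": "2007-09-04",
                    "attendance_days": 412})
# flagged -> ['attendance_days']
```

Checking records against the code book as they are entered catches extraction errors before analysis begins.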

Record Review Data: Example
ASAP Participant Outcomes

                               New York           Boston
                             Number    %       Number    %
  Enrollment Goal              188     –         112     –
  Enrollment Actual            152    81%         94    84%
  Trn. Completion Goal          97     –          48     –
  Trn. Completion Actual        87    89%         39    81%
  Placement Actual (30+)        41    48%         26    59%
  Placement Actual (180+)       83    97%         37    84%
  In-field Placement            77    93%         36    97%

Record Review Data: Example
Outcome: Delivering Healthy Babies

                        In Program        On Waiting List
                       Number    %        Number    %
  Babies Born            18                 22
  Born Healthy*          13     72%         14     64%
  Not Born Healthy*       5     28%          8     36%

*The indicator of a healthy baby is birthweight above 5.5 pounds AND an Apgar score of 7 or above.

Record Review Data: Example
Average Pre and Post Test Scores for Youth Enrolled in Summer Learning Camps

              Pre Test       Post Test      Difference
  Reading    22.7 (64%)     25.2 (72%)        +2.5
  Math       29.9 (85%)     29.7 (85%)        -0.2
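The difference column is simple arithmetic; a quick sketch (scores copied from the table above) shows how it could be computed for any number of subjects:

```python
# Pre/post averages keyed by subject, taken from the table above.
scores = {"Reading": (22.7, 25.2), "Math": (29.9, 29.7)}

# Round to one decimal place to avoid floating-point noise.
differences = {subject: round(post - pre, 1)
               for subject, (pre, post) in scores.items()}
# differences -> {'Reading': 2.5, 'Math': -0.2}
```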

What Happens After Data are Collected?
1. Data are analyzed; results are summarized.
2. Findings must be converted into a format that can be shared with others.
3. Action steps should be developed from findings.

Step 3 moves evaluation from perfunctory compliance into the realm of usefulness. “Now that we know _____ we will do _____.”
Increasing Rigor in Program Evaluation
 Mixed methodologies  Multiple sources of data  Multiple points in time

Building Evaluation Capacity Session 6
Designing Evaluations: Putting It All Together

Anita M. Baker, Ed.D.
Bruner Foundation Rochester, New York

Good Evaluation Designs Include the Following
 Summary information about the program
 The questions to be addressed by the evaluation
 The data collection strategies that will be used
 The individuals who will undertake the activities
 When the activities will be conducted
 The products of the evaluation (who will receive them and how they should be used)
 Projected costs to do the evaluation
What else must you think about? Data Collection Management
 Identify data sources
 Select data collection methods
 Develop and/or test instruments and procedures
 Develop plans for entering and managing the data
 Train data collectors
 Plan for analysis
 Plan to monitor the data collection system
Thinking about . . . . Data Collection Instruments
• Who will you collect data about?
Clients, caregivers, other service providers working with clients, staff, some other group? Who are considered participants of your program? Be sure to clearly specify your evaluation target population.

• What instruments do you need?
Surveys, interview guides, observation checklists and/or protocols, record extraction protocols?

• Are there any pre-tested instruments (e.g., scales for measuring human conditions and attitudes)?
  – If not, how will you confirm validity?
Thinking about . . . . Data Collection Instruments
Keeping in mind things like cultural sensitivity, language and expression:
• Are the instruments you plan to use appropriate for the group you are planning to use them with?
• Will responses be anonymous or confidential?
• How will you analyze data from the instruments you choose?

Thinking about . . . . Data Collection Procedures
• What are your timelines for data collection?
– When will you administer surveys, conduct interviews, etc.?
– Are pre/post strategies needed? Doable?

• When do you need data?
– Is this the same time that data collectors and subjects are available?
– What outcomes are expected by the time data collection is planned? (i.e., is this the proper timeframe?)

• What is required for data collection approval?
– Institutional review?
– Active consent?
– Passive consent?
– Informed consent?
Thinking about . . . . Data Entry and Management
• How will you store and maintain the information you collect?
– How much data is expected, and in what form?
– What procedures are necessary to ensure confidentiality?
– Where will the data reside?

• How will you handle data entry?
– Do you have specialty software, or can you use readily available programs like Excel to support your data entry?
– Who will actually enter the data, and where will it be entered? Are there training needs?
Thinking about . . . . Data Collector Training
• Who will collect the data?
Staff within a program, staff from another program, other agency staff, clients from another program (e.g., youth), volunteers?

• What training do data collectors need?
– Can they administer surveys?
– Do they know how to conduct interviews?
– Have they been trained as observers for this data collection?
– Do they have access to and knowledge about records?

Thinking about . . . . Data Analysis
• How will you analyze the data you collect?
– How will you handle quantitative data (e.g., frequencies, averages, ranges, distributions)? Do you need tables and graphs? Do you know how to make them?
– How will you handle qualitative data (e.g., quotes, “snippets,” numerical summaries)?
– What will you do about missing data?
– What influencing factors should you consider? What disaggregation is needed?

• Who (staff, volunteers, consultants) will conduct the analysis, and how long will it take? Will they need additional training?
• Are there any additional costs associated with data analysis?
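For the quantitative side, the basic summaries above (frequencies, averages, ranges) and a simple missing-data rule can be sketched in a few lines; the 1-to-5 survey responses below are hypothetical:

```python
from statistics import mean

# Hypothetical survey responses on a 1-5 scale; None marks missing data.
responses = [4, 5, None, 3, 4, None, 5, 4]

valid = [r for r in responses if r is not None]  # drop missing values
summary = {
    "n": len(valid),
    "missing": responses.count(None),
    "average": round(mean(valid), 2),
    "range": (min(valid), max(valid)),
    "frequencies": {v: valid.count(v) for v in sorted(set(valid))},
}
# summary -> {'n': 6, 'missing': 2, 'average': 4.17,
#             'range': (3, 5), 'frequencies': {3: 1, 4: 3, 5: 2}}
```

Reporting the missing count alongside the average keeps the analysis honest about how much data actually supports each finding.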
What are the Components of a Strong Evaluation Report?
* Subject program description
* Clear statement about the evaluation questions and the purpose of the evaluation
* Description of actual data collection methods used
* Summary of key findings (including tables, graphs, vignettes, quotes, etc.)
* Discussion or explanation of the meaning and importance of key findings
* Suggested action steps
* Next steps (for the program and the evaluation)
* Issues for further consideration (loose ends)
Additional Reporting Tips
 Findings can be communicated in many forms.
* brief memos
* PowerPoint presentations
* oral reports
* a formal evaluation report (most common)

 Think about internal and external reporting.
 Plan for multiple reports.
 Before you start writing, be sure to develop an outline and pass it by some stakeholders.
 If you’re commissioning an evaluation report, ask to see a report outline in advance.

 If you are reviewing others’ evaluation reports, don’t assume they are valuable just because they are in a final form. Review carefully for the important components and meaningfulness.
Projecting Level of Effort
LOE projections are often summarized in a table or spreadsheet. To estimate labor and time:
• List all evaluation tasks
• Determine who will conduct each task
• Estimate the time required to complete each task (including pre-training), in day or half-day increments (see page 42 in Participatory Evaluation Essentials)
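The three steps above lend themselves to a simple table; here is a sketch with hypothetical tasks, people, and day estimates:

```python
# Hypothetical LOE entries: (task, who, days), in half-day increments.
loe = [
    ("Design survey",      "Evaluator",     1.5),
    ("Administer survey",  "Program staff", 2.0),
    ("Conduct interviews", "Evaluator",     3.0),
    ("Analyze data",       "Evaluator",     2.5),
]

# Roll up days per person and overall, as an LOE spreadsheet would.
days_by_person = {}
for _task, person, days in loe:
    days_by_person[person] = days_by_person.get(person, 0.0) + days

total_days = sum(days for _task, _person, days in loe)
# days_by_person -> {'Evaluator': 7.0, 'Program staff': 2.0}; total_days -> 9.0
```

The per-person totals feed directly into the labor-cost calculation described under budgeting.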
Projecting Timelines
Timelines can be constructed separately or embedded in an LOE chart (see example pp. 44-45, Participatory Evaluation Essentials). To project timelines:
• Assign dates to your level of effort, working backward from overall timeline requirements.
• Be sure the number of days required for a task and when it must be completed are in sync and feasible.
• Check to make sure the evaluation calendar is in alignment with the program calendar.
   Don’t plan to do a lot of data collecting around program holidays.
   Don’t expect to collect data only between 9 and 5.
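Working backward from a deadline can be sketched with ordinary date arithmetic. The deadline, task list, and the five-working-days-per-week conversion below are all assumptions for illustration:

```python
from datetime import date, timedelta

deadline = date(2008, 6, 30)  # hypothetical report due date
# (task, working days), in the order the tasks must happen
tasks = [("Data collection", 20), ("Analysis", 10), ("Report writing", 5)]

start_dates = {}
end = deadline
for name, working_days in reversed(tasks):
    calendar_days = round(working_days * 7 / 5)  # rough calendar-time spread
    start = end - timedelta(days=calendar_days)
    start_dates[name] = start
    end = start  # the previous task must finish before this one starts

# start_dates -> {'Report writing': date(2008, 6, 23),
#                 'Analysis': date(2008, 6, 9),
#                 'Data collection': date(2008, 5, 12)}
```

A sketch like this makes it obvious when a deadline forces data collection into a holiday period, prompting a renegotiation of dates.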

Budgeting and Paying for Evaluation
• Usually the cost of good evaluation is equivalent to about 10-15% of the cost of operating the program effectively.

• Most of the funds for evaluation pay for the professional time of those who develop designs and tools, collect data, analyze data, and summarize and present findings.

• Other expenses include overhead and direct costs associated with the evaluation (e.g., supplies, computer maintenance, communication, software).

Projecting Budgets
• Determine rates for all “staff” on the project.
• Calculate total labor costs by multiplying LOE totals by “staff” rates.
• Estimate other direct costs (ODC) such as copying, mail/delivery, telephone use, and facilities.
• Estimate any travel costs.
• Calculate the subtotal of direct costs, including labor (fringe where appropriate), ODC, and travel.
• Estimate additional indirect (overhead) costs, where appropriate, as a percentage applied to the direct costs.
• Apply any other fees where appropriate.
• Sum all project costs to determine the total cost of the project.
• Establish a payment schedule, billing system, and deliverables.
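The steps above can be walked through with hypothetical figures (the rates, percentages, and amounts below are illustrative only, not recommended values):

```python
# Hypothetical LOE totals (days) and daily rates for project "staff".
loe_days   = {"Evaluator": 9.0, "Assistant": 4.0}
daily_rate = {"Evaluator": 600.0, "Assistant": 300.0}

labor    = sum(loe_days[p] * daily_rate[p] for p in loe_days)  # 6600.0
odc      = 450.0   # copying, mail/delivery, telephone, facilities
travel   = 300.0
direct   = labor + odc + travel                                # 7350.0
indirect = direct * 0.10   # assumed 10% overhead rate
total    = direct + indirect                                   # 8085.0
```

Laying the calculation out this way makes it easy to see which design changes (fewer interview days, less travel) would bring the total under an available budget.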

Things to Avoid when Budgeting and Paying for Evaluation
• It’s bad practice to assume there is a standard, fixed evaluation cost regardless of program size or complexity.

• It is dangerous to fund an evaluation project that does not clarify how evaluation funds will be used.

Budgeting and Paying for Evaluation
There are two ways to project evaluation costs:
 Identify a reasonable total amount of funds dedicated to evaluation and then develop the best evaluation design given those resources.

 Develop the best evaluation design for the subject program, and then estimate the costs associated with implementing the design. NEGOTIATE design changes if costs exceed available funds.

These materials are for the benefit of any 501(c)(3) organization. They MAY be used in whole or in part provided that credit is given to the Bruner Foundation.

They may NOT be sold or redistributed in whole or part for a profit. Copyright © by the Bruner Foundation 2007
* Please see the notes attached to the first slide for further information about how to use the available materials.
