Evaluation: Asking the Right Questions & Using the Answers

Presented by Annemarie Charlesworth, MA
UCSF National Center of Excellence in Women’s Health
November 3, 2006
Part 1 - Evaluation Overview
Part 2 - Steps to Program Planning and Evaluation
Part 3 - The Logic Model: A Tool for Planning and Evaluation

Part 1 - Evaluation Overview

What is Evaluation?

• The process of collecting information about
  your program in order to make decisions
  about it.

• Complements program management by
  improving and accounting for program
  effectiveness.
How is Evaluation Helpful?
• Gain insight
• Change practice
• Assess effects
• Affect participants

Gain Insight
• Assess needs, desires, and assets of
  community members.

• Identify barriers and facilitators to service
  use.

• Learn how to describe and measure
  program activities and effects.
           Change Practice
• Refine plans for introducing a new service.

• Characterize the extent to which plans were
  implemented.

• Improve the content of educational materials.

• Enhance the program's cultural competence.
      Change Practice (cont.)
• Verify that participants' rights are protected.
• Set priorities for staff training.

• Make midcourse adjustments for improvement.

• Improve the clarity of health communication
  messages.

• Mobilize community support for the program.
             Assess Effects
• Assess skills development by program
  participants.

• Compare changes in provider behavior over
  time.

• Compare costs with benefits.

• Find out which participants do well in the
  program.

• Decide where to allocate new resources.
       Assess Effects (cont.)
• Document the level of success in accomplishing
  objectives.

• Demonstrate that accountability requirements
  are fulfilled.

• Aggregate information from several evaluations
  to estimate outcome effects for similar kinds of
  programs.

• Gather success stories.
         Affect Participants
• Reinforce program/intervention messages.
• Stimulate dialogue/raise awareness regarding
  health issues.
• Broaden consensus among coalition members
  regarding program goals.
• Teach evaluation skills to staff and other
  stakeholders.
• Support organizational change and
  development.
Types of Program Evaluation
• Goals-based evaluation (identifying
  whether you’re meeting your overall
  objectives)
• Process-based evaluation (identifying your
  program’s strengths and weaknesses)
• Outcomes-based evaluation (identifying
  benefits to participants/clients)
   Type of evaluation depends on
     what you want to learn…
Start with:

1) What you need to decide (why are you
   doing this evaluation?);
2) What you need to know to make the
   decision;
3) How to best gather and understand that
   information!
 Key questions to consider when
  designing program evaluation:
1. For what purposes is the evaluation being
   done, i.e., what do you want to be able to decide
   as a result of the evaluation?

2. Who are the audiences for the information from
   the evaluation (e.g., funders, board,
   management, staff, clients)?

3. What kinds of information are needed to make
   the decision you need to make and/or enlighten
   your intended audiences?
Key questions (cont.)
4. From what sources should the information be
   collected (e.g., employees, customers, clients)?

5. How can that information be collected in a
   reasonable fashion (e.g., questionnaires,
   interviews, examining documentation)?

6. When is the information needed (so, by when
   must it be collected)?

7. What resources are available to collect the
   information?
Evaluation should be considered during
program planning and implementation…

Not just at the end!
It is not enough to have a goal…

Goals exist because some action is needed.

However, you can’t argue for an action without a
deep understanding of the problem.

Problem → Need → Action → Goal
Part 2 - Steps to Program Planning and Evaluation
10 Steps to Planning a Program (and its evaluation!)

1. Needs and assets
   – Extent, magnitude, and scope of the problem
   – Summary of what’s already being done
   – Gaps between needs and existing services
   – Community support

2. Goals and objectives
   – Long-term goals specific to the target population
   – Link short-term objectives to goals

3. Defining the intervention/treatment
   – Program components to accomplish objectives and goals
   – One or two activities should support each objective
10 Steps to Planning a Program (and its evaluation!)

4. Developing the program/logic model

5. Choose the type(s) of data collection (e.g.,
   surveys, interviews)

6. Select your evaluation design (e.g., one-group
   pre/posttest vs. comparison-group pre/posttest)
10 Steps to Planning a Program
      (and its evaluation!)
7. Pilot test tools

8. Collect data

9. Analyze data

10. Report, share, and act on the findings
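Steps 8-9 (collect and analyze data) for a one-group pre/posttest design can be sketched in a few lines. This is a minimal illustration only; the scores and variable names below are invented, not from any real program.

```python
# Minimal sketch of analyzing one-group pre/posttest data (steps 8-9).
# All scores are hypothetical placeholders, for illustration only.
from statistics import mean, stdev

pre  = [52, 61, 48, 70, 55, 63, 58, 66]   # baseline knowledge scores
post = [60, 68, 47, 78, 62, 71, 64, 70]   # scores after the program

# Per-participant change is the basic quantity of interest in a pre/post design.
changes = [b - a for a, b in zip(pre, post)]
print(f"Mean change: {mean(changes):.2f}")
print(f"SD of changes: {stdev(changes):.2f}")
```

A comparison-group design (step 6) would repeat the same calculation for a similar group that did not receive the program, so that the change can be attributed to the program rather than to outside events.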
Part 3 - The Logic Model: A Tool for Planning and Evaluation
• Picture of how your organization does its
  work
• Communicates its “rationale”
• Explains hypotheses and assumptions
  about why the program will work
• Links outcomes with activities
      Logic models help you
     chart the course ahead …

Allow you to better understand
• Challenges
• Resources available
• Timetable
• Big picture as well as smaller parts
Basic Logic Model

1. Resources/Inputs → 2. Activities → 3. Outputs → 4. Outcomes → 5. Impact

Planned Work: Resources/Inputs, Activities
Intended Results: Outputs, Outcomes, Impact

*From W.K. Kellogg Foundation Logic Model Development Guide
Basic Logic Model

• Resources: In order to accomplish our set of activities we will need the
  following:
• Activities: In order to address our problem or asset we will conduct the
  following activities:
• Outputs: We expect that once completed or under way these activities will
  produce the following evidence:
• Short and Long-term Outcomes: We expect that if completed or ongoing
  these activities will lead to the following changes in 1-3 then 4-6 years:
• Impact: We expect that if completed these activities will lead to the
  following changes in 7-10 years:
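As a planning aid, the five columns of the logic model can be captured in a plain data structure. The sketch below is one possible representation, not a standard schema; the entries are illustrative placeholders drawn from the free-clinic example.

```python
# A minimal sketch of a logic model as a plain Python dict.
# Field names and entries are illustrative assumptions, not a standard schema.
logic_model = {
    "resources":  ["first year's funding", "donated clinic facility"],
    "activities": ["recruit and train volunteers", "create an evaluation plan"],
    "outputs":    ["# of patient visits/year", "# of volunteers serving/year"],
    "outcomes":   ["increased referrals (1-3 yrs)", "more annual physicals (4-6 yrs)"],
    "impact":     ["25% reduction in uninsured ER visits/year (7-10 yrs)"],
}

# "Planned work" spans the first two columns; "intended results" the last three.
planned_work     = logic_model["resources"] + logic_model["activities"]
intended_results = (logic_model["outputs"] + logic_model["outcomes"]
                    + logic_model["impact"])
```

Writing the model down this way makes it easy to check that every activity has at least one output, and every output connects to an outcome.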
Example Logic Model for a free clinic to meet the needs of the growing
numbers of uninsured residents (Mytown, USA)

Resources:
• IRS 501(c)(3) status
• Diverse, dedicated board of directors representing potential partners
• Endorsement from Memorial Hospital, Mytown Medical Society, and United Way
• Donated clinic facility
• Job descriptions for board and staff
• First year’s funding ($150,000)
• Clinic equipment
• Board & staff orientation process
• Clinic budget

Activities:
• Launch/complete search for executive director
• Board & staff conduct Anywhere Free Clinic site visit
• Board & staff conduct planning retreat
• Design and implement funding strategy
• Design and implement volunteer recruitment and training
• Secure facility for clinic
• Create an evaluation plan
• Design and implement PR campaign

Outputs:
• # of patients referred from ER to the clinic/year
• # of qualified patients enrolled in the clinic/year
• # of patient visits/year
• # of medical volunteers serving/year
• # of patient fliers distributed
• # of calls/month seeking info about clinic

Short and Long-term Outcomes:
• Memorandum of Agreement for free clinic space
• Change in patient attitude about need for medical home
• Change in # of scheduled annual physicals/follow-ups
• Increased # of ER/physician referrals
• Decreased volume of unreimbursed emergencies treated in Memorial ER

Impact:
• Patient co-payments supply 20% of clinic operating costs
• 25% reduction in # of uninsured ER visits/year
• 300 medical volunteers serving regularly each year
• Clinic is a United Way Agency
• Clinic endowment established
• 90% patient satisfaction for 5 years
• 900 patients served/year

Produced by The W. K. Kellogg Foundation
S.M.A.R.T.
• Outcomes and Impacts should be:
  – Specific
  – Measurable
  – Action-oriented
  – Realistic
  – Timed
   One size does not fit all!

• Many different types of logic models
• Experiment with models that suit your
  program and help you think through your
  objectives
Useful for all parties involved
(Funder, Board, Administration, Staff, Participating
organizations, Evaluators, etc.)

• Convey the purpose of the program
• Show why it’s important
• Show what will result
• Illustrate the actions that will lead to the desired results
  – Basis for determining whether actions will lead to results!
• Serve as a common language


Enhance the case for investment in your
 program!
Strengthen Community Involvement
• Created in partnership, logic models give
  all parties a clear roadmap
• Helps to build community capacity and
  strengthen community voice
• Helps all parties stay on course or
  intentionally decide to go off-course
• Visual nature communicates well with
  diverse audiences
                Logic Models
Used throughout the life of your program

• Planning
• Program Implementation
• Program Evaluation

May change throughout the life of the program!

   – Fluid; a “working draft”
   – Responsive to lessons learned along the way
   – Reflect ongoing evaluation of the program
  The Role of the Logic Model in
    Program Design/Planning

• Helps develop strategy and create
  structure/organization
• Helps explain and illustrate concepts for key
  stakeholders
• Facilitates self-evaluation based on shared
  understanding
• Requires examination of best-practices research
The Role of the Logic Model in Program Implementation

• Backbone of the management plan
• Helps identify and monitor necessary data
• Helps improve the program
• Keeps the focus on achieving and documenting results
• Helps prioritize critical aspects of the program for tracking
 The Role of the Logic Model in
      Program Evaluation


• Provides information about progress
  toward goals
• Teaches about the program
• Facilitates advocacy for program approach
• Helps with strategic marketing efforts
                     References
• Kellogg Foundation
  http://www.wkkf.org/pubs/tools/evaluation/pub3669.pdf
• Schmitz, C. & Parsons, B.A. (1999) “Everything you wanted to know
  about Logic Models but were afraid to ask”
  http://www.insites.org/documents/logmod.pdf
• University of Wisconsin Cooperative Extension
  http://www.uwex.edu/ces/pdande/evaluation/evallogicmodel.html
• CDC Evaluation Working Group
  http://www.cdc.gov/eval/logic%20model%20bibliography.PDF
• CDC/MMWR - Framework for Program Evaluation in Public Health
  http://www.cdc.gov/mmwr/preview/mmwrhtml/rr4811a1.htm
• McNamara, C. (last revision: Feb 16, 1998) “Basic Guide to Program
  Evaluation” http://www.managementhelp.org/evaluatn/fnl_eval.htm

				