
Evidence Based Chronic Disease Prevention
Module 8: Evaluating the Program or Policy


Presented by: Karen Peters, DrPH
                    Objectives

 Understand the basic components of program evaluation.
 Describe the differences and unique contributions of quantitative and qualitative evaluation.
 Understand the concepts of measurement validity and reliability.


Module One   Evidence Based Chronic Disease Prevention   2
[Figure: the course's evidence-based planning cycle, linking: develop a statement of the issue; determine what is known in the literature; quantify the issue; develop program or policy options; develop an action plan; evaluate the program or policy; dissemination]
Module One             Evidence Based Chronic Disease Prevention        3
                            Overview
 The “what's” and “why's” of evaluation
 Study designs and measurement issues
 Types of evaluation
        Quantitative
        Qualitative
 Organizational issues in evaluation
 Infeasible to provide an in-depth discussion of mechanics
        e.g., sampling, instrument development
        Can show basic components of an evaluation, what to look for, where to turn for help

Module One          Evidence Based Chronic Disease Prevention   4
               What is Evaluation?

 “A process that attempts to determine as systematically and objectively as possible the relevance, effectiveness, and impact of activities in light of their objectives.”
       From: Last JM. A Dictionary of Epidemiology. Third Edition. New York: Oxford University Press; 1995.

 Complex and diverse field


Module One       Evidence Based Chronic Disease Prevention   5
             Some questions to consider

 What are the most significant challenges you face in program evaluation?

 In program planning, when should you begin planning an evaluation?




Module One        Evidence Based Chronic Disease Prevention   6
                     Why Evaluate?
 Public health agencies need to be accountable
 Basis for choices when resources are limited
        Helps to determine costs and benefits
 Source of information for improving programs and policies
 Increasingly mandated by funders/policy makers
        Convincing funders is not always easy
 Leads to research questions that can be tested in other studies/programs

Module One          Evidence Based Chronic Disease Prevention   7
                  New Directions

 Social programs have become broader, more complex, and interactive
       Seek to bring about changes in community capacity, social support, decision-making, control over resources, and individual behavior
 Time to supplement traditional strategies with new approaches reflecting the complexity of community-based initiatives


Module One       Evidence Based Chronic Disease Prevention   8
              New Directions
 More flexible evaluation approaches can play a role in building community capacity and self-determination
 Need to re-direct program evaluation toward community-based public health values
 Traditional evaluation is conducted by 'experts' to determine whether program objectives were met, strengths/weaknesses/replicability, and contribution to scientific knowledge
Module One   Evidence Based Chronic Disease Prevention   9
              New Directions

 Some evaluators believe communities lack the skills to design, engage in, and interpret evaluations
 However, 'experts' may lack the insight and flexibility needed to capture the 'essence' of community projects or to answer questions raised by communities and CBOs



Module One   Evidence Based Chronic Disease Prevention   10
              New Directions

 A community-based evaluation perspective involves a more participatory and inclusive process that incorporates the values, knowledge, expertise, and interests of the community and uses evaluation as a tool for community capacity building
 Involving the community as a full and equal partner allows for development of more 'relevant' measures of program success and produces data that are useful in community settings
Module One   Evidence Based Chronic Disease Prevention   11
                     New Directions
 Evaluation is one part of a broader planning process, but it can help in:
       reflecting on progress
       documenting where you are going and where you have come from
       sharing what worked and what did not with other communities
       demonstrating the need for targeted resources to address community issues
       illustrating the impact of community-based initiatives to decision/policy makers
       providing information on developing meaningful community-based indicators
Module One          Evidence Based Chronic Disease Prevention   12
               Research Phases Model
               (Greenwald and Cullen, 1985)
 Phase 1: Hypothesis Development
       Ex: the link between sedentary behavior and obesity
 Phase 2: Methods Development
       Pilot test of an intervention to increase physical activity and validation of measures
 Phase 3: Controlled Intervention Trial (Efficacy)
       Small-scale randomized trial of physical activity
 Phase 4: Defined Population (Effectiveness)
       Larger-scale trial of physical activity in populations of interest
 Phase 5: Demonstration (Dissemination)
       Evaluation of results of the physical activity program
Module One         Evidence Based Chronic Disease Prevention   13
                   Evaluation Models

 For Practitioners:
       Models identify key factors to consider when developing/selecting health behavior programs
       Factors to focus on when reading the literature

 For Researchers:
       Models identify important dimensions to be included in program evaluations


Module One          Evidence Based Chronic Disease Prevention   14
                Evaluation Models

2    Major Frameworks (there are many others)

 For    Process and Formative Evaluation
     CDC    Framework for Public Health Programs


 For    Impact and Outcome Evaluation
     RE-AIM    Framework



Module One       Evidence Based Chronic Disease Prevention   15
   Framework for Program Evaluation

2     year process by CDC
      designed as a framework for ongoing, practical
       program evaluation
      can be integrated with routine program program
       operations
      involves input from program staff, community
       members, other stakeholders, not just
       evaluation experts
 Involves 6   basic steps and 4 broad
    evaluation standards
Module One      Evidence Based Chronic Disease Prevention   16
          Broad Evaluation Framework
                  (ala CDC)*
     1.      Engage stakeholders
     2.      Describe the program
     3.      Focus the evaluation design
     4.      Gather and analyze evidence
     5.      Justify conclusions
     6.      Ensure use and share lessons learned
     *CDC Framework for Program Evaluation in Public
        Health, 9/17/99 and Center for Advancement of
        Community Based Public Health, June 2000

Module One          Evidence Based Chronic Disease Prevention   17
       CDC Framework for Evaluation

4     Evaluation standards
      Guidelines that  can help assess whether an
        evaluation is well designed
 Utility:      Is the evaluation useful?
      Does   the evaluation answer questions that are
        relevant to the stakeholders
 Feasibility:       Is the evaluation practical?
      Is    the evaluation realistic and cost-effective


Module One          Evidence Based Chronic Disease Prevention   18
       CDC Framework for Evaluation

 Propriety: Is the evaluation ethical?
       Does the evaluation consider the rights and interests of those involved and affected?
 Accuracy: Is the evaluation correct?
       Do the evaluation findings convey information that is correct and technically adequate?




Module One        Evidence Based Chronic Disease Prevention   19
       CDC Framework for Evaluation

 Step 1: Engage the stakeholders
       Stakeholders - those involved in implementing the program and those served or affected by it, including decision-makers who can do something with the results
 Standards for Step 1: Utility and Propriety
       Utility: Have you identified individuals and organizations affected by the evaluation?
       Are those involved in the evaluation trustworthy and competent?
Module One        Evidence Based Chronic Disease Prevention   20
       CDC Framework for Evaluation

 Propriety:
       Is there an explicit, written agreement about what is to be done, how, by whom, and when?
       Does the evaluation design protect the rights and welfare of those involved?
       Are the individuals who are conducting the evaluation interacting respectfully with stakeholders?
       Have you discussed conflicts of interest openly and honestly?
Module One      Evidence Based Chronic Disease Prevention   21
             CDC Evaluation Framework

 Step 2: Describe the program
       Summarize the program being evaluated with a statement of need that includes expectations, a logic model, the resources available to conduct program activities, and how the program fits into the larger organizational/community context
       A good description allows the program to be compared to similar efforts and makes it easier to figure out what parts brought about what effects
Module One        Evidence Based Chronic Disease Prevention   22
             CDC Evaluation Framework

 Standards for Step 2: Accuracy/Propriety
       Accuracy: Have you clearly and accurately described the program? Have you documented the program context?
       Propriety: Is the evaluation complete and fair? Does it assess program strengths and weaknesses?




Module One        Evidence Based Chronic Disease Prevention   23
             CDC Evaluation Framework

 Step 3: Focus the evaluation design
       Specify the evaluation's overall intent
       Determine who and what the evaluation is for (users and uses)
       What questions the evaluation should examine
       What methods are best to answer the questions




Module One         Evidence Based Chronic Disease Prevention   24
             CDC Evaluation Framework
       The purpose of evaluation depends on the program's stage of development
       New or developing program - feasibility of the intervention approach
       Implementation - fine-tuning or changes needed
       Established program - assess program effects

 Standards for Step 3:
       Feasibility
       Propriety
       Accuracy
Module One         Evidence Based Chronic Disease Prevention   25
       CDC Evaluation Framework
 Feasibility
      Have you considered the political interests/needs of groups and obtained 'buy-in'?
      Does the information produced justify the costs?
      Are the evaluation procedures practical?

 Propriety
      Does the evaluation design help to identify needs?
      Are costs guided by sound, ethical accountability procedures?
 Accuracy
      Is there an accurate description of evaluation purposes and procedures?
Module One         Evidence Based Chronic Disease Prevention   26
             CDC Evaluation Framework

 Step 4: Gather credible evidence
       Need a well-rounded picture of the program
       Develop indicators that translate program concepts into specific measures
       Use multiple sources of evidence that reflect different perspectives on the program
       Techniques used to gather and handle evidence should be compatible with cultural conditions in each program setting

Module One         Evidence Based Chronic Disease Prevention   27
             CDC Evaluation Framework

 Standards in Step 4: Utility and Accuracy
       Utility: Are you collecting information that addresses pertinent program issues and is responsive to stakeholder needs?
       Accuracy: Have you adequately described your sources of information?
       Do data collection procedures address internal validity and reliability issues?
       Is there a system in place for identifying and correcting errors?
Module One        Evidence Based Chronic Disease Prevention   28
             CDC Evaluation Framework

 Step 5: Justify conclusions
       Involves making claims about a program based on the evidence gathered
       Stakeholder values provide the basis for making judgments about program merits/performance
       Conclusions are based on analysis, synthesis, and interpretation of information to detect patterns and result in recommendations
       Reaching good conclusions requires a variety of stakeholder perspectives
Module One         Evidence Based Chronic Disease Prevention   29
             CDC Evaluation Framework

 Standards for Step 5: Accuracy/Utility
       Accuracy: Has the data analysis process been effective in answering key evaluation questions? Can you explicitly justify your conclusions?
       Utility: Have you carefully described the perspectives, procedures, and rationale used to interpret the findings?




Module One        Evidence Based Chronic Disease Prevention   30
             CDC Evaluation Framework

 Step 6: Ensure use, share lessons learned
       Make sure stakeholders understand the evaluation procedure and findings
       All participants should have the opportunity to provide feedback
       Evaluators should provide any needed follow-up
       Use a variety of communication strategies to disseminate evaluation results in a timely and unbiased manner
Module One         Evidence Based Chronic Disease Prevention   31
             CDC Evaluation Framework

 Standards for Step 6: Utility, Propriety, Accuracy
       Utility - Do evaluation reports describe the program context, purpose, procedures, and findings clearly?
       Propriety - Have evaluators made sure findings (including limitations) are accessible to everyone affected by the program?
       Accuracy - Do evaluation reports reflect the findings fairly and impartially?
Module One        Evidence Based Chronic Disease Prevention   32
       RE-AIM Evaluation Framework
              Glasgow, Vogt, Boles, 1999
        Glasgow, McKay, Piette, Reynolds, 2001


 Reach, Efficacy or Effectiveness (depending on research phase), Adoption, Implementation, Maintenance

 RE-AIM relies on 2 comprehensive models
       PRECEDE-PROCEED (Green & Kreuter 1999)
       Diffusion Theory (Rogers 1995; Nutbeam 1996)


Module One      Evidence Based Chronic Disease Prevention   33
             Dimensions of RE-AIM
 Reach
    Individual Level
    What % of potentially eligible participants will take part?
    How representative are they?

 Efficacy or Effectiveness
    Individual Level
    What was the impact on all who began?
    What was the impact on intermediate and primary outcomes?
    What were the positive/negative (unintended) outcomes, including quality of life?
Module One       Evidence Based Chronic Disease Prevention   34
            Dimensions of RE-AIM
 Adoption
    Setting Level
    What % of settings/intervention agents will participate? (worksites, schools, educators, nurses)
    How representative are they?

 Implementation
    Setting or Agent Level
     To what extent were the intervention components delivered as intended (in the protocol) when conducted in applied settings by non-researchers?
Module One        Evidence Based Chronic Disease Prevention   35
             Dimensions of RE-AIM

 Maintenance
       Individual and Setting Levels

       Individual Level: What are the long-term effects (minimum 6-12 months following the intervention)?

       Setting Level: To what extent are different intervention components continued or institutionalized?
       (A sketch of how these dimensions can be summarized as simple proportions follows this slide.)
Module One        Evidence Based Chronic Disease Prevention   36
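In practice, the RE-AIM dimensions above are usually summarized first as simple proportions. Below is a minimal sketch of that tallying in Python; every count, and the helper function, is a hypothetical illustration rather than data or code from any real program.

```python
# Minimal sketch of RE-AIM summary metrics using hypothetical counts.
# All numbers are illustrative assumptions, not real program data.

def proportion(numerator: int, denominator: int) -> float:
    """Return a percentage, guarding against a zero denominator."""
    return 100.0 * numerator / denominator if denominator else 0.0

# Reach (individual level): participants / potentially eligible individuals
reach_pct = proportion(420, 2000)

# Adoption (setting level): participating settings / settings approached
adoption_pct = proportion(12, 30)

# Implementation (setting/agent level): protocol components delivered as intended
implementation_pct = proportion(18, 24)

# Maintenance (setting level): settings still delivering the program at follow-up
maintenance_pct = proportion(8, 12)

print(f"Reach:          {reach_pct:.1f}% of eligible individuals took part")
print(f"Adoption:       {adoption_pct:.1f}% of settings participated")
print(f"Implementation: {implementation_pct:.1f}% of components delivered as intended")
print(f"Maintenance:    {maintenance_pct:.1f}% of settings continued the program")
```

Representativeness (the second half of the Reach and Adoption questions) is usually checked separately, by comparing participants with non-participants on whatever characteristics are available.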
  Common Challenges and Suggested
 Strategies Using RE-AIM Framework
 Reach
       Challenge: Not including a relevant, high-risk, or representative sample
       Strategy: Use population-based recruitment or oversample high-risk groups; reduce exclusion criteria
 Efficacy or Effectiveness
       Challenge: Ambiguous outcomes
       Strategy: Assess a broader set of outcomes, conduct subgroup analyses, use different assessment points
Module One      Evidence Based Chronic Disease Prevention   37
  Common Challenges and Suggested
 Strategies Using RE-AIM Framework
 Adoption
      Challenge: Program never adopted or endorsed - used only in academic settings
      Strategy: Involve participants in all phases; approach numerous settings early on, while revision is still possible
 Implementation
      Challenge: Protocols not delivered as intended (Type III error)
      Strategy: Assess whether the treatment is too complicated, intensive, or incompatible; involve non-researchers
Module One      Evidence Based Chronic Disease Prevention   38
  Common Challenges and Suggested
 Strategies Using RE-AIM Framework
 Maintenance
       Challenge: Program or effects not maintained over time
       Strategy: Include a maintenance phase in protocol and evaluation plans; leave the treatment behind after the study and plan for institutionalization




Module One      Evidence Based Chronic Disease Prevention     39
    Common Study/Evaluation Designs
      Experimental/randomized


      Quasi-experimental
             • Time-series
             • Use of existing data




Module One             Evidence Based Chronic Disease Prevention   40
             Study/Evaluation Designs

 Quasi-experimental
       Increasing attention
       At least one intervention and one comparison group, without randomization
       Appeal of intervening through intact social groups
       See the Koepsell chapter in the readings (in Module 4)




Module One       Evidence Based Chronic Disease Prevention   41
              Study/Evaluation Designs

 Use the best designs feasible
       Pre-/post-data
       Comparison groups (see the difference-in-differences sketch below)
       Complete program records
       Reliable and valid measures
       Proper analytic techniques
       Review principles/tools from Goodman, page 39 (in Module 8)


Module One         Evidence Based Chronic Disease Prevention   42
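When pre-/post-data and a comparison group are available but randomization is not, one common way to frame the analysis is a difference-in-differences contrast: the change in the intervention group minus the change in the comparison group. The sketch below uses hypothetical outcome values; a real analysis would also need variance estimation and adjustment for clustering.

```python
# Difference-in-differences sketch for a pre/post design with a comparison group.
# Hypothetical group means (e.g., minutes of physical activity per week).

intervention_pre, intervention_post = 95.0, 130.0
comparison_pre, comparison_post = 98.0, 110.0

# Change within each group over the study period
change_intervention = intervention_post - intervention_pre   # +35.0
change_comparison = comparison_post - comparison_pre         # +12.0

# Subtracting the comparison-group change nets out the secular trend
did_estimate = change_intervention - change_comparison       # +23.0

print(f"Intervention change:  {change_intervention:+.1f}")
print(f"Comparison change:    {change_comparison:+.1f}")
print(f"Difference-in-differences estimate: {did_estimate:+.1f}")
```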
              Challenges in Community-Wide Studies
 Varying degrees of intervention “exposure”
 Running programs in multiple locations
 Accounting for community-level variance
 Lack of sensitivity of the “community”
       Concepts of participatory research
              •   Equity, collective decision making
              •   High-quality, ethical research
              •   Addressing social inequalities
              •   Maximize learning opportunities
       See the Goodman article on community capacity
Module One               Evidence Based Chronic Disease Prevention   43
              Challenges in Community-Wide Studies
 Community-level variance
       Individuals in communities, neighborhoods, schools, and worksites are correlated
       ICC (intra-class correlation coefficient)
              • People have related characteristics (not independent)
       In practical terms, ignoring this clustering means an increased chance of Type I error (saying there is a difference when there really is not); see the design effect sketch below
Module One            Evidence Based Chronic Disease Prevention   44
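The practical cost of intra-class correlation is often expressed as a design effect, DEFF = 1 + (m - 1) * ICC, where m is the average number of individuals per community. The sketch below shows how even a small ICC shrinks the effective sample size; the ICC, cluster size, and number of communities are assumptions chosen only for illustration.

```python
# Design effect sketch: how intra-class correlation inflates variance and
# shrinks the effective sample size in community-level designs.
# The ICC and cluster sizes below are illustrative assumptions.

icc = 0.02                  # intra-class correlation coefficient (assumed)
m = 50                      # average individuals per community (assumed)
communities = 10
n_total = communities * m   # nominal sample size

deff = 1 + (m - 1) * icc    # design effect
n_effective = n_total / deff

print(f"Design effect: {deff:.2f}")
print(f"Nominal n = {n_total}, effective n ~ {n_effective:.0f}")
# Analyzing the data as if all observations were independent understates
# the variance, which is what inflates the chance of a Type I error.
```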
              Measurement Issues
   Components of a “good” evaluation
        Adequate sample size
        High validity
        High reliability
   Sample size considerations
      Number of communities
      Number of individuals per community
      Increasing the number of communities versus the number of individuals per community
      Can rely on simple, accessible programs like Epi Info (a rough sketch of the underlying calculation follows below)

Module One        Evidence Based Chronic Disease Prevention   45
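For orientation, the kind of calculation a tool like Epi Info performs when comparing two proportions can be sketched by hand, then inflated by the design effect when whole communities rather than individuals are assigned to the program. The proportions, power, and design effect below are assumptions for illustration only.

```python
import math

# Rough per-group sample size for detecting a difference between two
# proportions (two-sided alpha = 0.05, power = 0.80), then inflated by a
# design effect for a community-level design. All inputs are illustrative.

p1, p2 = 0.30, 0.40        # expected proportions, comparison vs. intervention
z_alpha = 1.96             # critical value for two-sided alpha = 0.05
z_beta = 0.84              # critical value for 80% power

p_bar = (p1 + p2) / 2
n_individual = ((z_alpha * math.sqrt(2 * p_bar * (1 - p_bar))
                 + z_beta * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
                / (p1 - p2) ** 2)

deff = 1 + (50 - 1) * 0.02  # design effect from the previous sketch

print(f"Per group, individual randomization: {math.ceil(n_individual)}")
print(f"Per group, community design (DEFF={deff:.2f}): {math.ceil(n_individual * deff)}")
```

Because the design effect grows with cluster size, adding communities usually buys more effective sample size than adding individuals within communities, which is the trade-off flagged in the bullet above.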
     Concepts of Validity and Reliability and their Importance:
 Measurement Issues
       Evaluation “threats”:
              • Validity
              • Is the instrument or design measuring exactly what was intended? (self-report vs. biologic test)
       Reliability
              • Is the measurement being conducted consistently? (face-to-face vs. telephone, different interviewers)

Module One            Evidence Based Chronic Disease Prevention   46
               Measurement Issues

   Validity: best available approximation to the “truth”
 Internal Validity
       The extent of causality (the effects are really attributable to the program)
 External Validity
       The extent of generalizability
       Importance?? (so what???)



Module One       Evidence Based Chronic Disease Prevention   47
             Measurement Issues:

 Major      threats to validity*
      Low  statistical power
      Violated assumptions in statistical tests
      Reliability of measures
      Reliability of treatment implementation
      Random confounders in the experiment
      Random heterogeneity of respondents
     * adapted from Cook and Campbell, 1979


Module One       Evidence Based Chronic Disease Prevention   48
                 Measurement Issues:

 Reliability (repeatability)
    Consistency in measurement
    Multiple types (see the sketch below)
              • Inter-observer
              • Test-retest
              • Internal consistency




Module One            Evidence Based Chronic Disease Prevention   49
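Two of the reliability types listed above can be illustrated with small made-up datasets: test-retest reliability as the correlation between two administrations of the same measure, and internal consistency as Cronbach's alpha across the items of a scale. This is a minimal sketch using only the Python standard library; the scores are invented for illustration.

```python
from statistics import mean, pvariance

# --- Test-retest reliability: correlate scores from two administrations ---
time1 = [4, 5, 3, 4, 2, 5, 3, 4]   # hypothetical scale scores, first wave
time2 = [4, 5, 2, 4, 3, 5, 3, 5]   # same respondents, second wave

def pearson_r(x, y):
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

print(f"Test-retest r = {pearson_r(time1, time2):.2f}")

# --- Internal consistency: Cronbach's alpha for a 4-item scale ---
# Rows are respondents, columns are items (hypothetical responses).
items = [
    [3, 4, 3, 4],
    [5, 5, 4, 5],
    [2, 3, 2, 2],
    [4, 4, 5, 4],
    [3, 2, 3, 3],
]
k = len(items[0])                                    # number of items
item_vars = [pvariance(col) for col in zip(*items)]  # variance of each item
total_var = pvariance([sum(row) for row in items])   # variance of total scores
alpha = (k / (k - 1)) * (1 - sum(item_vars) / total_var)
print(f"Cronbach's alpha = {alpha:.2f}")
```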
    Measurement Issues: Examples

 Validity
       Self-reported rate of having a health professional check hemoglobin 'A1C' among diabetics in an intervention program, compared with clinic records (see the agreement sketch below)
 Reliability
       Test-retest data from the BRFSS on self-report of seeing a health care professional in the last year for diabetes among diabetics in Illinois
Module One       Evidence Based Chronic Disease Prevention   50
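For the validity example above, agreement between self-report and clinic records is often summarized as percent agreement plus Cohen's kappa, which discounts the agreement expected by chance. The 2x2 counts below are made up for illustration.

```python
# Agreement between self-reported A1C checks and clinic records
# (hypothetical counts, for illustration only).
#                     record: yes   record: no
# self-report: yes        a = 60       b = 15
# self-report: no         c = 10       d = 40

a, b, c, d = 60, 15, 10, 40
n = a + b + c + d

observed = (a + d) / n                          # percent agreement
p_yes = ((a + b) / n) * ((a + c) / n)           # chance agreement on "yes"
p_no = ((c + d) / n) * ((b + d) / n)            # chance agreement on "no"
expected = p_yes + p_no
kappa = (observed - expected) / (1 - expected)  # Cohen's kappa

print(f"Observed agreement: {observed:.2f}")
print(f"Cohen's kappa:      {kappa:.2f}")
```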
     Measurement Issues: Ensuring
       Validity and Reliability in
               Evaluation
 Literature/contacting researchers may show you accepted methods
 Multiple statistical methods are available to report validity and reliability
 Evaluation instruments often need community contouring
       Participatory methods may prevent use of existing instruments/questions


Module One      Evidence Based Chronic Disease Prevention   51
             Four Purposes of Evaluation

 Needs        Assessment

 Process


 Impact


 Outcome


Module One        Evidence Based Chronic Disease Prevention   52
             How do you decide?

 Use  data available
 What does the data tell you?
 What other information do you want?
 What other programs are already available
  in different locations?
 Reality constraints: Staffing, resources, time




Module One    Evidence Based Chronic Disease Prevention   53
                  Needs Assessment

 Diagnostic Evaluation
 Context Evaluation

       Feedback on knowledge, attitudes, risk behaviors, health status, and perceived needs of the target population, and on the status of available health promotion programs




Module One        Evidence Based Chronic Disease Prevention   54
                Process Evaluation

 Formative evaluation

       Feedback on program implementation; site response; participant response, including appropriateness of materials, methods, and content; practitioner response; and personnel competency




Module One        Evidence Based Chronic Disease Prevention   55
      Example: IL Arthritis Awareness

3     components:
      brochure alone
      brochure withcommunity outreach
      brochure, community outreach, physician
       education


 What  would you want to know about each
    component?

Module One      Evidence Based Chronic Disease Prevention   56
               Types of Evaluation:
                Process evaluation
   “Field of Dreams” evaluation
         “If you build it, will they come?”
 Shorter-term feedback on program implementation, content, methods, participant response, practitioner response
 What is working, what is not working
 Uses quantitative or qualitative data
 Data usually involve counts, not rates or ratios


Module One          Evidence Based Chronic Disease Prevention   57
                   Types of Evaluation:


 Considerations for process evaluation
      1. Sources of data
              • Program data
      2. Limitations of data
      3. Time frame
      4. Availability

      Example
              • Number of diabetics obtaining foot examinations statewide through primary care providers
Module One           Evidence Based Chronic Disease Prevention   58
                     Impact Evaluation

 Summative evaluation
 Long- or short-term feedback on knowledge, attitudes, beliefs, and behavior change (KABBs) of participants (skills development), and on programs and policies of organizations and governments
       Time 1 - decide what to include in the intervention
       Time 2 - did the intervention make a difference?
              • Conduct a survey or interview participants

Module One             Evidence Based Chronic Disease Prevention   59
               Impact Evaluation:

 Uses       quantitative or qualitative data

 Probably   more realistic endpoints for
    most public health programs and
    policies




Module One       Evidence Based Chronic Disease Prevention   60
                    Impact Evaluation:

 Considerations for impact evaluation
      1. Sources of data
              • Surveillance or program data
      2. Limitations of data (validity and reliability)
      3. Time frame
      4. Availability
      Example
             Eye examination rates of diabetics in Illinois (see the rate sketch below)


Module One           Evidence Based Chronic Disease Prevention   61
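For an impact measure like the eye-examination example, the data question usually reduces to a rate with an uncertainty interval around it. Below is a rough sketch with hypothetical counts and a simple normal-approximation interval; real BRFSS-style estimates would also need survey weights.

```python
import math

# Hypothetical counts: diabetics reporting a dilated eye exam in the past
# year. Numbers are illustrative only, not Illinois surveillance data.
exams = 1340
respondents_with_diabetes = 2150

rate = exams / respondents_with_diabetes
se = math.sqrt(rate * (1 - rate) / respondents_with_diabetes)
ci_low, ci_high = rate - 1.96 * se, rate + 1.96 * se

print(f"Eye-exam rate: {rate:.1%} (95% CI {ci_low:.1%} to {ci_high:.1%})")
# Comparing this rate before and after the program, or against a comparison
# area, is the core impact-evaluation question.
```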
             Types of Evaluation:

 Outcome evaluation
       Long-term feedback on health status, morbidity, mortality, disability, or quality of life
       Uses quantitative data
       Present throughout the integrated strategic plan
       Also called summative evaluation



Module One       Evidence Based Chronic Disease Prevention   62
                   Types of Evaluation:

 Considerations for outcome evaluation
      1. Sources of data
              • Routine surveillance data
      2. Limitations of data (validity and reliability)
      3. Time frame
      4. Availability
              • IDPH IPLAN website
      Example
              • Rates of end-stage renal disease among diabetics in Illinois

Module One            Evidence Based Chronic Disease Prevention   63
             Types of Evaluation:
             Outcome evaluation
 Some movement toward standardizing (outcome) indicators, e.g., CDC's consensus indicators
 What if yours is not on the “list”?
     1. Race-ethnicity-specific infant mortality rate
     2. Motor vehicle crash death rate
     3. Work-related injury death rate
     4. Suicide rate
     5. Lung cancer death rate
     6. Breast cancer death rate
     7. Cardiovascular disease death rate
     8. Homicide rate
Module One        Evidence Based Chronic Disease Prevention   64
                 Types of Evaluation:
                 Outcome evaluation
     9.      All-cause mortality rate
     10.     AIDS incidence
     11.     Measles incidence
     12.     Tuberculosis incidence
     13.     Syphilis incidence
     14.     Incidence of low birth weight
     15.     Births to adolescents
     16.     Prenatal care
     17.     Childhood poverty
     18.     Proportion of persons living in counties
             exceeding EPA standards for air quality


Module One           Evidence Based Chronic Disease Prevention   65
             Types of Evaluation:
          Outcome/impact evaluation
     Common public health data sources
     1.  Vital events registries
     2. Disease specific registries
      3. Population and housing census
     4. Routine health services records
     5. Health programs delivery records
     6. Hospital discharge data
     7. Disease notification within surveillance systems
     8. Sample surveys within surveillance systems
     9. Other sample surveys
     10. Other data banks from programs outside the
         health sector
Module One       Evidence Based Chronic Disease Prevention   66
  Data Questions for Consideration

 What  type of evaluation is this?
 How would you analyze these data?
 What would be your initial conclusions?
 What other data might support or
  detract from your conclusions?
 What are some limitations of your risk
  factor data?


Module One   Evidence Based Chronic Disease Prevention   67
             Types of Evaluation


      Outcome and Impact (performance measures): “Doing the Right Things”

      Process: “Doing Things Right”



Module One     Evidence Based Chronic Disease Prevention   68
                  Evaluation Framework
      Process - Program: Instructors? Content? Methods? Time allotment? Materials?
      Impact - Behavior: Knowledge gain? Attitude change? Habit change? Skill development?
      Outcome - Health: Mortality? Morbidity? Disability? Quality of life?
      (Adapted from Green et al., 1980)
Module One               Evidence Based Chronic Disease Prevention           69
                 Evaluation Polarities



              Formative    vs.  Summative
              Qualitative  vs.  Quantitative
              Process      vs.  Impact / Outcome




Module One             Evidence Based Chronic Disease Prevention   70
               Types of Evaluation

 Quantitative versus qualitative methods
       Avoid choosing one or the other
 Generally more familiarity with quantitative methods




Module One       Evidence Based Chronic Disease Prevention   71
             Quantitative vs. Qualitative

 Quantitative (numbers): surveillance data, surveys, records

 Qualitative (words): observations, in-depth individual interviews, group interviews, focus groups, diaries




Module One         Evidence Based Chronic Disease Prevention   72
             Why Qualitative?

 When there are no quantitative instruments available
 When you are not sure whether the measures are appropriate for the population you are working with
 When you are not sure whether you are asking the right questions
 When you want to understand program processes, not just impacts and outcomes
Module One   Evidence Based Chronic Disease Prevention   73
               Why Qualitative?

 When you want more in-depth information about program implementation:
       What do clients experience?
       What services are provided to clients?
       How is the program organized?
       What do staff do? How has the program developed?
 When you want detailed, descriptive information about the program for the purpose of improving it
Module One      Evidence Based Chronic Disease Prevention   74
              Observation Dimensions

 Role of observer:
       full participant, partial participant, onlooker/observe as outsider
 Covert or overt:
       Do others know you are observing?
 Duration of observation:
       short term vs. long term
 Focus of observation:
       narrow, single component of the program vs. the whole program
Module One          Evidence Based Chronic Disease Prevention   75
      Types of Questions You Can Ask

 Behavioral/experience
 Opinion/value
 Feeling
 Knowledge
 Sensory
 Demographic/background
 (ALL from    past, present, future)

Module One    Evidence Based Chronic Disease Prevention   76
             In Depth Interviewing: Types

 Informal conversational interview


 General interview              guide

 Standardized open-ended interview




Module One         Evidence Based Chronic Disease Prevention   77
    Informal Conversational Interview

 No interview guide; no predetermined set of questions
 Particularly useful when you do not know what types of questions to ask and when you will be in the environment for some time
 Each interview builds on the previous one - flexibility and responsiveness to previous experiences and the environment

Module One   Evidence Based Chronic Disease Prevention   78
    Informal Conversational Interview

 Requires a long time to get systematic information
 Open to interviewer effects
       fatigue
       conversational skills
       capacity to build rapport




Module One         Evidence Based Chronic Disease Prevention   79
             Interview Guide

 List of questions or issues to be explored - a framework
 Can get similar information from several individuals
 General topics and probes
 General outline to discuss a predetermined subject
 Maintain conversational quality

Module One   Evidence Based Chronic Disease Prevention   80
      Standard Open-Ended Interview

 Set of carefully worded and arranged questions
 Less flexibility in issues to explore or probing
 Minimizes variation in questions asked, thus reducing bias
 Useful when you have several interviewers



Module One   Evidence Based Chronic Disease Prevention   81
    Focus Group and Group Interviews
 Relatively homogeneous group
 6-10 people
 Semi-structured
 Focus group - participants build on each other's ideas - advantage of group process; can see the influence of social networks on the issue at hand
 Group interview - several people together, not necessarily taking advantage of group process
Module One     Evidence Based Chronic Disease Prevention   82
             Wording of Questions

 Open-ended - do not presuppose dimensions of feeling, thoughts, or experiences
 Clear - use appropriate language; reflect back the language you hear
 Singular questions - can preface by saying you will ask about strengths, weaknesses, likes, dislikes - then ask about each individually
 Neutrality regarding the subject and rapport with the individual
Module One     Evidence Based Chronic Disease Prevention   83
             Wording of Questions

 Acknowledge multiple experiences - can provide illustrations or examples [some have said this, others that, still others have different opinions - what do you think?]

 Avoid leading questions




Module One      Evidence Based Chronic Disease Prevention   84
                              Probes

 Can you give me an example of that?

 Can I stop you here for a minute… can you tell me a bit more about x? [can re-direct conversation]

 Does anyone else have other experiences with this?

Module One         Evidence Based Chronic Disease Prevention   85
              Recording Data

 Permission and consent
 Notes - during and after the interview; include your feelings [tired/excited; how did the interview seem to you?]
 Note surroundings
 Tape recording - make sure it is on, make sure it works


Module One   Evidence Based Chronic Disease Prevention   86
                         Sampling
 Purposeful sampling - the notion of gaining information from those who have it
       extreme or deviant cases - outstanding successes or notable failures
       maximum variation - look for common themes across multiple participants or programs - small sample, great diversity
       homogeneous samples - individuals or focus groups



Module One       Evidence Based Chronic Disease Prevention   87
                           Sampling

 Snowball or chain sampling
       recommended informants - who else should I speak with?

 Criterion sampling
       all cases that meet some predetermined criteria




Module One         Evidence Based Chronic Disease Prevention   88
                      Analysis

 From transcription and notes
 Focused coding - with predetermined categories in mind (see the sketch below)
 Open coding - categories and themes emerge from the data itself
 Multiple coders
 Label excerpts so you can go back to the context
 Triangulation

Module One   Evidence Based Chronic Disease Prevention   89
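As a small illustration of focused coding, the sketch below tags interview excerpts against predetermined categories and keeps an identifier so each coded excerpt can be traced back to its context. The codebook, keywords, and excerpts are invented for illustration; real qualitative coding rests on analyst judgment that keyword matching cannot replace.

```python
# Focused-coding sketch: tag interview excerpts with predetermined categories
# and keep a label so coded text can be traced back to its context.
# Categories, keywords, and excerpts are illustrative assumptions.

codebook = {
    "access_barriers": ["transportation", "cost", "hours", "waiting"],
    "social_support": ["family", "friend", "neighbor", "church"],
    "program_delivery": ["class", "instructor", "materials", "schedule"],
}

excerpts = [
    ("participant01_line12", "The class times conflict with my work hours."),
    ("participant02_line33", "My neighbor drives me because the bus costs too much."),
    ("participant03_line08", "The instructor explained the materials clearly."),
]

for label, text in excerpts:
    lowered = text.lower()
    codes = [code for code, keywords in codebook.items()
             if any(word in lowered for word in keywords)]
    print(f"{label}: {', '.join(codes) or 'uncoded'} -> {text}")
```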
             Reliability and Validity
 Neutrality rather than objectivity and subjectivity
      impartial, not predisposed toward certain findings ahead of time; no predetermined results to support
 Notion of generalizability vs. extrapolation
      generalizability - impossible because nothing is context free
       extrapolation - speculation on the likely applicability of findings to other situations under similar, not identical, conditions
Module One             Evidence Based Chronic Disease Prevention   90
   Other Considerations in Qualitative
              Evaluation
 Involve stakeholders in the development of program objectives and evaluation questions
 Evaluation requires clear program objectives
 Measure program processes, impacts, and outcomes using measures appropriate for the questions asked
       Expect frustration from those collecting data and resistance from those feeling judged
Module One        Evidence Based Chronic Disease Prevention   91
    Organizational Issues/Summary
   Program personnel may be threatened by the evaluation
        Need to maintain objectivity
        Involvement may reduce resistance but may threaten objectivity
 Trade-off of a comprehensive evaluation versus nothing at all
 Think of the “10% Rule” as you design and implement programs



Module One        Evidence Based Chronic Disease Prevention   92
    Organizational Issues/Summary
       “How can I do evaluation when there's so much 'real work' to do?”
       “Independent” (outside) evaluation may be useful
       What to look for in a “good” evaluation
       Remember the use of multi-disciplinary teams


Module One      Evidence Based Chronic Disease Prevention   93
             A Word on Coalitions and
                  Partnerships
 There seems to be an interest in and an emphasis on the use and effectiveness of coalitions as part of public health programming
 Little is currently known about 'impact' and 'outcome' measures for coalitions
 More is known about the 'process' of coalitions
 Both qualitative and quantitative measures are needed in this area

Module One       Evidence Based Chronic Disease Prevention   94
         Other Resources on Evaluation
 World Wide Web Virtual Library: Evaluation
    http://user.berlin.de/~alfio.cermi/index.html
 CDC Evaluation Resources
    http://www.cdc.gov/eval/over.htm
 CDC Framework for Program Evaluation in Public Health
    http://www.cdc.gov/mmwr/preview/mmwrhtml/rr4811a1.htm
 Center for the Advancement of Community-Based Public Health (www.cbph.org)
 University of Toronto (http://www.utoronto.ca/shp/hcu)
 University of Wisconsin Extension: http://www.uwex.edu/ces/pdante/evaluat/htm
 Kellogg Foundation: http://www.wkkf.org/Publications/evalhdbk/


 Module One      Evidence Based Chronic Disease Prevention   95
               Course Objectives
   Construct a concise, measurable statement of a public health issue
   Determine what is known in the scientific literature
   Describe public health issues using quantitative data sources
   Generate public health program or policy options
   Create public health program or policy action plans
   Construct a logic model
   Evaluate a public health program
Module One      Evidence Based Chronic Disease Prevention   96

								