Are Children Safe in Safe Communities

Evaluation Considerations: Measures & Methods

Shrikant I. Bangdiwala, PhD
Professor of Research in Biostatistics
Injury Prevention Research Center
University of North Carolina at Chapel Hill, USA

Presented at the WHO VIP Webinar 2011

Content

• Purpose of evaluation
• Cycle of program planning & evaluation
• Indicators
• Study designs
• Statistical modeling
• Challenges

What are we 'evaluating'?

• Actions, programs, activities
• Conducted in a community setting, over a period of time
• Aimed at reducing deaths, injuries, and/or events and behaviors that cause injuries

Example: Suwon, South Korea, an area of 'safety promotion'

http://www.phs.ki.se/csp/safecom/suwon2.htm

Why do we 'evaluate'?

• To know ourselves what works and whether we are doing some good
    - in performing some activity
    - in the community
    - in the country
• To convince funders and supporters that their investment is worthwhile
• To convince the community about the benefits of the multiple activities and actions carried out

Main purposes of evaluation

Evaluation helps determine:
• How well a program/policy works relative to its goals & objectives
• Why a program/policy did or didn't work, relative to planned process
• How to restructure a program/policy to make it work, or work better
• Whether to change funding for a program

Methodological complications

Multiplicities:
• Multiple components of a program
• Multiple populations at risk
• Multiple study designs
• Multiple types of effects/impacts/outcomes & severities
• Multiple audiences/objectives of 'evaluation'
• Multiple methods for conducting evaluation

When should evaluation be considered?

• Evaluation needs to begin in, and be part of, the planning process…
• Otherwise, "if you do not know where you are going, it does not matter which way you go, and you will never know if you got there or not!"
      Lewis Carroll (1872), Alice in Wonderland

(Adapted from M. Garrettson)

Types of evaluation depending on program phase

• Program planning phase: Formative evaluation. How can the program activities be improved before implementation?
• Program implementation phase: Process evaluation. How is/was the program (being) implemented?
• Post-program phase: Impact/outcome evaluation. Did the program succeed in achieving the intended impact or outcome?

Cycle of program planning and evaluation
(Adapted from C Runyan)

[Figure: the planning and evaluation cycle - Identify problem & population → Define target audience → Identify resources → Set goals/objectives → Choose strategies → Test & refine implementation (formative evaluation) → Implement → Process/impact/outcome evaluation → Disseminate → back to Identify problem & population]

Identify population & problem

• Surveillance data
• Other needs assessment strategies:
    - key informant interviews
    - focus groups
    - surveys
    - evaluations of past programs
    - literature
    - consultation with peers
    - other info…

Define target audience

• To whom is the program directed?
    - Whose injuries need to be reduced?
    - Who is the target of the program?
        • at-risk persons
        • caregivers (e.g. parents)
        • general public
        • media
        • decision makers

Understand target audience

• What are their characteristics?
    - Special needs (e.g. literacy)
    - Interests, concerns, priorities
    - Attitudes & beliefs regarding the problem & solutions to the problem
    - Cultural issues

Identify resources

• Community partners
    - interest in the topic
    - working on similar projects
• Ongoing activities
• Sources of financial support
• Interests in the community

Set goals & objectives

• Goal
    - a broad statement of what the program is trying to accomplish
• Objectives
    - Specific
    - Measurable
    - Time-framed

Choose strategies

• Identify existing strategies/programs
    - Literature: evidence-based? promising practice?
    - WHO manuals
    - Successes from other communities, regions, countries
• Develop new strategies:
    - Logic model (how would it work?)
    - Haddon matrix

Haddon Matrix
(Haddon 1970, Am J Public Health)

                 Person    Vehicle/vector    Physical environment    Social environment
    Pre-event
    Event
    Post-event

3-dimensional Haddon Matrix
(Runyan 1998, Injury Prevention)

[Figure: the Haddon matrix (pre-event, event, post-event) extended by a third dimension of decision criteria: Effectiveness, Cost, Freedom, Equity, Stigmatization, Preferences, Feasibility, Other??]

Formative evaluation
(Modified from Thompson & McClintock, 2000)

Questions it answers:
• What is the best way to influence the target population?
• Will the activities reach the people intended, and be understood and accepted by the target population?
• How can activities be improved?

Why it's useful:
• Improves (pilot-tests) program activities before full-scale implementation
• May increase the likelihood that the program or policy will succeed
• May help stretch resources

Implementation

• As planned, with attention to detail
• Documented clearly so others can replicate if appropriate

Process evaluation

Purpose is to address:
• What was done?
• How was it implemented?
• How well was it implemented?
• Was it implemented as planned?

Process evaluation - examples of questions

• Who carried out the intervention?
• Was this the appropriate person/group?
• Who supported and opposed the intervention?
• What methods/activities were used?

Process evaluation - why is it useful?
(Modified from Thompson & McClintock, 2000)

• Allows replication of programs that work.
• Helps understand why programs fail.

The intervention cannot be a black box…

It must be clearly understood:

    Idea  →  ?  →  Outcome

Impact evaluation

Purpose is to address changes in:
• knowledge
• attitudes
• beliefs / values
• skills
• behaviors / practices

Using impact measures for establishing effectiveness

• Suppose our strategy is a public safety campaign
• We need to show:
      Campaign → Behavior → Outcome
• If we have already demonstrated that
      Behavior → Outcome
  we simply need to show
      Campaign → Behavior

Outcome evaluation

Purpose is to address changes in:
• injury events (e.g. frequency, type, pattern)
• morbidity (e.g. frequency, severity, type)
• mortality (e.g. frequency, time to death)
• cost (e.g. direct and indirect)

Example: Bike helmets

Interventions:
• Physician counseling of parents
• Enforcement of a helmet law
• Media campaign

Impacts:
• Parental attitudes toward child helmet use
• Purchase of helmets
• Use of helmets by children

Outcomes:
• Head injury in bike crashes
• Deaths from head injury in crashes

[In the original figure, arrows link the interventions to the impact measures and the impact measures to the outcomes.]

Evaluation - examples of questions for a local smoke-alarm policy

Did the local policy of smoke alarms in apartments…
• get passed?
• Were people aware of it?
• Did people have access to smoke alarms?
• Did people get them installed properly?
• Do people keep them maintained?
• Did it lead to a reduction in the number or rates of:
    - events (e.g. apartment fires)
    - injuries
    - deaths
    - costs (e.g. burn center costs, family burden, property loss)

Evaluation - selection of measures

'Quantitative indicators':
• Process
• Impact
• Outcome
    - Health-related
    - Financial

Choice of measure or indicator

We need to choose appropriate impact and outcome measures:
• 'Soft' (more difficult to measure) outcomes
    - Perception constructs: fear, insecurity, wellbeing, quality of life
    - Knowledge, attitude, and behavior constructs
• 'Hard' outcomes
    - Deaths, hospitalizations, disabilities due to injuries and violence
    - Societal impacts: local development indicators
• Economic outcomes
    - Direct $/€/£/¥, indirect DALYs, QALYs, opportunities lost, burdens

Evidence of effectiveness

• Obtain qualitative 'evidence' to complement the quantitative 'evidence'
    - e.g. Are "multisectoral collaborations and partnerships" friendly and functioning well?
    - e.g. Is "community participation" optimal?
• Incorporate process indicators
• Incorporate narratives & testimonials

Dissemination

• Dissemination not done well:
    - Not attempted
    - Not based on research about how to disseminate information to the intended audience
• Dissemination done well:
    - Defining the audience
    - Determining how to access the audience
    - Determining how best to communicate the change message to them
    - Presenting clear, straightforward messages

Evaluation measures

• Lead to evidence of effectiveness
• But only if the research and study methodologies, and the statistical analysis methodologies, are appropriate to convince the funders and supporters, the skeptics, the stakeholders, and the community
    - and are understandable to them

Research methodology approach: evidence of effectiveness

• Obtain quantitative 'evidence' that favors the hypothesis that the intervention is effective, as opposed to the (null) hypothesis that the intervention is not effective.
• How?
    - Experimental study designs: randomized clinical trials, group-randomized experiments, community-randomized studies
    - Quasi-experimental study designs: non-randomized comparative studies, before-after studies
    - Observational studies: cohort studies, case-control studies, and comparative cross-sectional studies

Randomized controlled trial (RCT) / experiment

'Strongest' evidence

    Randomize →  Intervention group:   O   X   O
                 Control group:        O   X'  O

    (O = observation/measurement; X = intervention; X' = the comparison condition)

Quasi-experimental designs

'Qualified' evidence

    Intervention group:   O   X   O
    Comparison group:     O       O

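One common way to quantify the evidence from this pre/post design with a comparison group is a difference-in-differences contrast. The sketch below is only an illustration with invented counts; the slides do not prescribe this particular analysis.

# Minimal sketch (hypothetical numbers): difference-in-differences for the
# quasi-experimental design above (intervention community vs comparison
# community, each observed before and after the program).
pre_interv, post_interv = 120, 90    # injury counts, intervention community
pre_comp, post_comp = 115, 110       # injury counts, comparison community

change_interv = post_interv - pre_interv   # -30
change_comp = post_comp - pre_comp         # -5

# change in the intervention community beyond the secular trend
did = change_interv - change_comp          # -25
print(f"difference-in-differences estimate: {did}")
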
One group pre/post

'Weak' evidence

    Intervention group:   O   X   O

One group, multiple pre / multiple post

Better 'weak' evidence

    Intervention group:   O  O  O   X   O  O  O  O  O

One group, post only

'Basically ignorable' evidence

    Intervention group:   X   O

Observational designs - cohort study

Evidence?

    Self-chosen intervention group:       X  …  X   X
    Self-chosen non-intervention group:   O  …  O   O

Observational designs - case-control study

Evidence?

    Cases:      X
                O
    Controls:   X
                O

Observational designs - cross-sectional study

Evidence?

[Figure: a single cross-section in which the injured and the non-injured groups each contain a mix of X's and O's]

Statistical analysis methodologies

• The choice is often guided by what has been done previously, what is feasible to do, or what is easy to explain
• The choice should be tailored to the audience & their ability to understand results, but also to the ability of the presenter to explain the methodologies

Statistical analysis

• Determined by the research question(s)
• Guided by study design: experimental or observational
    - Group-randomized controlled experiment
    - Non-randomized comparison study
    - Single-site pre/post; surveillance study
    - Retrospective or cross-sectional
• Guided by whether the outcome is studied at a single time point or at multiple time points
    - Time series analyses
• Guided by the audience
    - Visual and descriptive appreciation

Visual and descriptive analysis - longitudinal time series

Example: Espitia et al (2008), Salud Pública de México

[Figure: longitudinal time series plot from the published example]

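As a rough illustration of this kind of display, the sketch below plots a simulated monthly injury series and marks the start of an intervention. The counts, the month-36 start date, and the assumed post-intervention drop are all invented; the Espitia et al example uses real surveillance data.

# Minimal sketch (simulated data): visual/descriptive display of a
# longitudinal injury time series with the intervention start marked.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
months = np.arange(72)                               # six years of monthly data
baseline = 40 + 5 * np.sin(2 * np.pi * months / 12)  # seasonal baseline level
effect = np.where(months >= 36, -8.0, 0.0)           # assumed drop after month 36
counts = rng.poisson(np.clip(baseline + effect, 1, None))

plt.plot(months, counts, marker="o", linewidth=1)
plt.axvline(36, linestyle="--", label="intervention begins")
plt.xlabel("Month")
plt.ylabel("Injury events")
plt.title("Simulated longitudinal series, before and after an intervention")
plt.legend()
plt.show()
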
Visual and descriptive analysis - comparisons over time

Example: www.gapminder.org

Statistical analysis - challenge

• What we as a field have not done as well as other fields is to draw strength from numbers → develop collective evidence
• Combine results from multiple studies:
    - Systematic reviews (of observational studies)
    - Meta-analysis (of experimental & observational studies)
    - Meta-regression (of heterogeneous studies)
    - Mixed treatment meta-regression (for indirect comparisons)

Systematic reviews

• A protocol-driven, comprehensive review and synthesis of data focusing on a topic or on related key questions:
    - formulating specific key questions
    - developing a protocol
    - refining the questions of interest
    - conducting a literature search for evidence
    - selecting studies that meet the inclusion criteria
    - appraising the studies critically
    - synthesizing and interpreting the results

Example - systematic review

[Figure: excerpt from Shults et al (2001), Amer J Prev Med]

Systematic reviews

• Of particular value in bringing together a number of separately conducted studies, sometimes with conflicting findings, and synthesizing their results.
      Zaza et al (2001), Amer J Preventive Medicine – motor vehicle
• To this end, systematic reviews may or may not include a statistical synthesis called meta-analysis, depending on whether the studies are similar enough that combining their results is meaningful.
      Green (2005), Singapore Medical Journal

Meta-analysis

• A method of combining the results of studies quantitatively to obtain a summary estimate of the effect of an intervention
    - Often restricted to randomized controlled trials
    - Recently, the Cochrane Collaboration has been 'branching out' to include both experimental and observational studies in meta-analyses

Meta-analysis

[Figure: example from Liu et al (2008), Cochrane Collaboration]

Meta-analysis

The combining of results should take into account:
• the 'quality' of the studies
    - assessed by the reciprocal of the variance
• the 'heterogeneity' among the studies
    - assessed by the variance between studies

Meta-analysis - estimation of effect
(Borenstein et al (2009), Introduction to Meta-Analysis)

• The estimate is a weighted average, where the weight of a study is the reciprocal of its variance
• To calculate the variance of a study, one can use either a 'fixed' effects model or a 'mixed'/'random' effects model
    - Fixed effects model: utilizes no information from other studies →
        $\operatorname{var}(Y_i) = \operatorname{var}(e_i) = V_{Y_i}$
    - Random effects model: considers variance among and within studies →
        $Y_i = \mu + \zeta_i + e_i$, with $\operatorname{var}(\zeta_i) = \tau^2$ and $\operatorname{var}(Y_i) = \tau^2 + V_{Y_i} = V^{*}_{Y_i}$

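To make the weighting concrete, here is a minimal numerical sketch with invented study estimates and variances. The between-study variance τ² is estimated with the DerSimonian-Laird moment estimator, one common choice that the slide itself does not prescribe.

# Minimal sketch (hypothetical data): fixed-effect and random-effects pooled
# estimates, weighting each study by the reciprocal of its variance.
import numpy as np

y = np.array([0.30, 0.10, 0.25, 0.05])    # hypothetical study effect estimates Y_i
v = np.array([0.02, 0.01, 0.03, 0.015])   # their within-study variances V_Yi

# Fixed effects model: weight by 1 / V_Yi
w_fixed = 1.0 / v
theta_fixed = np.sum(w_fixed * y) / np.sum(w_fixed)

# DerSimonian-Laird estimate of the between-study variance tau^2
q = np.sum(w_fixed * (y - theta_fixed) ** 2)
c = np.sum(w_fixed) - np.sum(w_fixed ** 2) / np.sum(w_fixed)
tau2 = max(0.0, (q - (len(y) - 1)) / c)

# Random effects model: weight by 1 / (V_Yi + tau^2)
w_random = 1.0 / (v + tau2)
theta_random = np.sum(w_random * y) / np.sum(w_random)

print(f"fixed-effect estimate:   {theta_fixed:.3f}")
print(f"tau^2 (between studies): {tau2:.3f}")
print(f"random-effects estimate: {theta_random:.3f}")
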
Meta-analysis & meta-regression

• Dealing with 'heterogeneity' among the studies ($\tau^2$):
    - Decompose the total variance into among-study and within-study components → use mixed effects models to get a more precise estimate of the intervention effect
• If there is still residual heterogeneity:
    - Expand the mixed effects model to include study-level covariates that may explain some of the residual variability among studies → meta-regression

Meta-regression

e.g.
    $Y_i = \mu + \beta_1 X_{1i} + \beta_2 X_{2i} + \zeta_i + e_i$

where $\mu$ is the overall mean, $\zeta_i$ the study random effect, $e_i$ the random error, $X_1$ a study variable (EU/USA), and $X_2$ a study variable (population type).

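A minimal numerical sketch of such a meta-regression follows, with invented study estimates, variances, and two binary study-level covariates, fitted by inverse-variance weighted least squares (a fixed-effect meta-regression; a random-effects version would add an estimated τ² to each study's variance).

# Minimal sketch (hypothetical data): meta-regression
# Y_i = mu + beta1*X1_i + beta2*X2_i + error, by weighted least squares.
import numpy as np

y = np.array([0.30, 0.10, 0.25, 0.05, 0.18, 0.22])    # study effect estimates
v = np.array([0.02, 0.01, 0.03, 0.015, 0.02, 0.01])   # within-study variances
x1 = np.array([1, 0, 1, 0, 1, 0])                     # e.g. region indicator (EU = 1, USA = 0)
x2 = np.array([0, 0, 1, 1, 0, 1])                     # e.g. population-type indicator

X = np.column_stack([np.ones_like(y), x1, x2])        # design matrix with intercept
W = np.diag(1.0 / v)                                  # inverse-variance weights

# Weighted least squares: beta_hat = (X'WX)^{-1} X'Wy
beta_hat = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
print("estimated mu, beta1, beta2:", np.round(beta_hat, 3))
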
Meta-analysis

• Standard meta-analytical methods are typically restricted to comparisons of 2 interventions using direct, head-to-head evidence alone.
• So, for example, if we are interested in the Intervention A vs Intervention B comparison, we would include only studies that compare Intervention A versus Intervention B directly.
• Many times we have multiple types of interventions for the same type of problem, and we hardly ever have head-to-head comparisons
• We may also have multiple-component interventions

Mixed treatment meta-analysis
(Welton et al 2009, Amer J Epid)

• Let the outcome variable be a binary response: 1 = positive response, 0 = negative response
• We can tabulate the binomial count $r_{j:k}$ out of a total number at risk $n_{j:k}$ on the $k$th intervention in the $j$th study
• We can then calculate
    $\hat{p}_{j:k} = r_{j:k} / n_{j:k}$,
  the estimated probability of the outcome (risk of response) for the $k$th intervention in the $j$th study

Mixed treatment meta-analysis
(Welton et al 2009, Amer J Epid)

• Let each study have a reference "standard" intervention arm, $s_j$, with study-specific "standard" log odds of outcome, $\mu_j$.
• The log odds ratio, $\delta_{j:k}$, of outcome for intervention $k$, relative to standard $s_j$, is assumed to come from a random effects model with
    - mean log odds ratio $(d_k - d_{s_j})$, and
    - between-study standard deviation $\sigma$,
  where $d_k$ is the mean log odds ratio of outcome for intervention $k$ relative to control (so that $d_1 = 0$).

Mixed treatment meta-analysis
(Welton et al 2009, Amer J Epid)

This leads to the following logistic regression model:

    $\ln\!\left(\dfrac{p_{j:k}}{1 - p_{j:k}}\right) =
      \begin{cases}
        \mu_j                 & \text{intervention} = s_j \\
        \mu_j + \delta_{j:k}  & \text{intervention} = k
      \end{cases}$

where $\delta_{j:k} \sim N\!\left((d_k - d_{s_j}),\ \sigma^2\right)$.

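The simplest special case of this framework is an indirect comparison of two interventions that have each been compared with the same standard but never head to head. The sketch below uses invented pooled log odds ratios and a Bucher-type calculation; a full mixed treatment analysis would instead fit the model above to all arms of all studies.

# Minimal sketch (hypothetical numbers): indirect comparison of interventions
# A and C via a common standard B, on the log odds ratio scale.
import math

d_AB, var_AB = -0.40, 0.04   # pooled log OR of A vs B, and its variance
d_CB, var_CB = -0.15, 0.05   # pooled log OR of C vs B, and its variance

d_AC = d_AB - d_CB           # indirect log OR of A vs C
var_AC = var_AB + var_CB     # variances add for the indirect contrast
half_width = 1.96 * math.sqrt(var_AC)

print(f"indirect log OR (A vs C): {d_AC:.2f}, "
      f"95% CI {d_AC - half_width:.2f} to {d_AC + half_width:.2f}")
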
Mixed treatment meta-analysis - multiple-method interventions

• If we have multiple methods in the $i$th intervention: $M_{1i}, M_{2i}, M_{3i}, \ldots$
• Plus we have multiple times when the outcome is assessed:

    $Y_{it} = \mu + \alpha_i + \beta_1 t + \beta_2 M_{1,it} + \beta_3 M_{2,it} + \beta_4 X_i + e_{it}$

where $\alpha_i$ is the study effect, $\beta_1 t$ the time effect, $\beta_2$ and $\beta_3$ the effects of components 1 & 2, $X_i$ a study covariable, and $e_{it}$ the error term.

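A minimal sketch of fitting a model of this form with statsmodels follows, using a random intercept for each study to play the role of the study effect. All numbers, variable names, and the times at which the components switch on are invented for illustration.

# Minimal sketch (simulated data): mixed-effects model along the lines of
# Y_it = mu + alpha_i + beta1*t + beta2*M1_it + beta3*M2_it + beta4*X_i + e_it,
# with a random study effect alpha_i.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
rows = []
for study in range(8):
    alpha = rng.normal(0, 0.5)              # study-level random effect
    x = int(rng.integers(0, 2))             # study covariable (e.g. urban vs rural)
    for t in range(6):                      # outcome assessed at multiple times
        m1, m2 = int(t >= 2), int(t >= 4)   # components 1 and 2 switched on over time
        y = 10 + alpha - 0.3 * t - 1.0 * m1 - 0.8 * m2 + 0.5 * x + rng.normal(0, 0.7)
        rows.append(dict(study=study, t=t, m1=m1, m2=m2, x=x, y=y))
df = pd.DataFrame(rows)

result = smf.mixedlm("y ~ t + m1 + m2 + x", df, groups=df["study"]).fit()
print(result.summary())
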
Statistical analysis

• Methodology does exist for developing stronger collective evidence and for evaluating the effectiveness of community-based interventions, using different types of study designs and interventions
• Developing "practice-based evidence"

Dissemination

• We should not stop at developing the evidence
• We must work alongside economists in developing ways to effectively communicate 'what works' → methodology and cost models do exist for estimating the "return on investment"
• Money talks!!

Challenges - evaluation requires

• Integration of evaluation from the beginning
• Appropriate measures that can be collected objectively, without bias, easily, and completely
• Appropriate qualitative and process information to complement the quantitative information
• Concrete and convincing evidence of what aspects work in individual communities
• Formal methodological statistical evaluation of specific elements of programs
• Collective evidence of what common elements of programs work
• Effective dissemination strategies ("return on investment")