
Mixed Methods in Randomized Trials:
Realizing the Potential, Avoiding the Pitfalls

James P. Spillane and Amber Stitziel Pareja, Northwestern University
Carol Barnes, University of Michigan
Eric Camburn, University of Wisconsin-Madison
Jason Huff and Ellen Goldring, Vanderbilt University
Henry May and Jonathan Supovitz, University of Pennsylvania

The Study of School Leadership
http://www.thestudyofschoolleadership.com

Funded by: Institute of Education Sciences, National Science Foundation
Overview

 Mixed Methods
    Scarcity of examples of mixed-methods studies
    Parallel studies
 Describe our efforts to mix qualitative and quantitative methods
 Qualitative Approaches: scenarios, shadowing, cognitive interviews, treatment-delivery observations
 Quantitative Approaches: principal survey, teacher survey, principal logs
Mixed Methods Research

 Mixed-Method Data Analysis Strategies (Caracelli and Greene, 1993)
    Data Transformation
    Typology Development
    Extreme Case Analysis
    Data Consolidation/Merging
Uses of Mixed Methods Research

 Parallel Mixed Analysis
    Analyze qualitative data with quantitative and qualitative techniques
    Analyze quantitative data with quantitative and qualitative techniques
 Sequential QUAN-QUAL Analysis
 Sequential QUAL-QUAN Analysis
 Multi-Step Sequential Analysis
Mixed Methods in Professional Development (PD) Evaluation

 Concurrent Mixed Analysis
    To validate our quantitative research instruments
    To better understand key constructs
    To build new measures
    To create typologies for further exploration
Sequential QUAN-QUAL Analysis

 Compared End-of-Day (EOD) log data and shadowing data (a comparison sketch follows below)
    Agreement was high
    Principals under-reported building operations and finance on the EOD log
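
The slides do not show the mechanics of the comparison; the following is a minimal sketch of one way to do it, assuming the EOD log and the shadower's record for each principal-day are coded into a shared set of activity categories (the category names and data are hypothetical, not the study's instrument):

# Sketch: compare EOD log self-reports against shadowing observations.
# Each record is the set of activity categories noted for one
# principal-day; categories and example data are hypothetical.

CATEGORIES = ["instruction", "administration", "building_ops_finance",
              "student_affairs", "professional_growth"]

def agreement(eod: set, shadow: set) -> float:
    """Fraction of categories on which the two sources agree
    (both report the activity, or both omit it)."""
    matches = sum((c in eod) == (c in shadow) for c in CATEGORIES)
    return matches / len(CATEGORIES)

def under_reported(eod: set, shadow: set) -> set:
    """Categories the shadower observed but the principal did not log."""
    return shadow - eod

# One hypothetical principal-day:
eod_day = {"instruction", "administration"}
shadow_day = {"instruction", "administration", "building_ops_finance"}
print(agreement(eod_day, shadow_day))       # 0.8 -> high agreement
print(under_reported(eod_day, shadow_day))  # {'building_ops_finance'}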
Sequential QUAN-QUAL Analysis

 Analyzed qualitative field notes from two principals qualitatively
    Generated a typology of possible reasons why principals might fail to report an activity
       Brevity Hypothesis
       Non-Continuous Hypothesis
       Sequencing Hypothesis
       Regularity Hypothesis
Sequential QUAN-QUAL Analysis

 Analyzed observational field notes from five principals
    Quantitized the field note data (a coding sketch follows below)
       Coded each instance of the working hypotheses
       Three coders worked independently
    Refined the working hypotheses
       Dropped the Regularity Hypothesis
       Articulated a new hypothesis: the Overshadowing Hypothesis
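
The slides do not detail the quantitizing step; a minimal sketch, assuming each coder produced an excerpt-to-code mapping (the codes and data below are hypothetical illustrations, not the study's actual coding):

# Sketch: "quantitizing" coded field notes and checking how often three
# independent coders agreed. All excerpts and codes are hypothetical.
from collections import Counter
from itertools import combinations

codings = {
    "coder_1": {"ex1": "brevity", "ex2": "sequencing", "ex3": "brevity"},
    "coder_2": {"ex1": "brevity", "ex2": "sequencing", "ex3": "non_continuous"},
    "coder_3": {"ex1": "brevity", "ex2": "overshadowing", "ex3": "brevity"},
}

# Frequency of each hypothesis code, pooled across coders.
print(Counter(code for coded in codings.values() for code in coded.values()))

# Pairwise percent agreement among the three coders.
for a, b in combinations(codings, 2):
    shared = codings[a].keys() & codings[b].keys()
    pct = sum(codings[a][e] == codings[b][e] for e in shared) / len(shared)
    print(f"{a} vs {b}: {pct:.2f}")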
Multi-Step Sequential Analysis

 The analysis provided evidence that the Brevity, Non-Continuous, Sequencing, and Overshadowing Hypotheses were tenable
 Many cases supported two or more hypotheses (a tallying sketch follows below)
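
Cases that support several hypotheses amount to multi-label codes; a minimal sketch of tallying them, with hypothetical case data:

# Sketch: count how many hypotheses each case supports, assuming each
# unreported activity (case) carries every code it was judged to support.
from collections import Counter

case_codes = {
    "case_1": {"brevity"},
    "case_2": {"brevity", "sequencing"},
    "case_3": {"brevity", "non_continuous", "overshadowing"},
    "case_4": {"sequencing", "overshadowing"},
}

per_case = Counter(len(codes) for codes in case_codes.values())
multi = sum(n for k, n in per_case.items() if k >= 2)
print(per_case)  # distribution: number of hypotheses per case
print(f"{multi} of {len(case_codes)} cases support two or more hypotheses")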
Sequential QUAL-QUAN Analysis

 Extending the QUAL-QUAN or QUAN-QUAL analysis
 QUAL-QUAN-QUAL: shadowing, then cognitive interviews
Types and Purposes of Classroom Visits: Most Common to Least Common

Drop-in Visits: Monitoring

 Principals looking for student and/or teacher compliance with required tasks or instructional goals
    Using rubrics
    Infusing arts into the curriculum
    Goal sheets to track standardized test performance
    Teacher accountability

  “The ultimate goal is if the children are on task, if the teachers are engaged with the children, their academic achievement is going to be higher. So my goal is to really assist the children in their academic achievement. And I believe that the sticking the head in gives the teachers more motivation to work with the children more.”
Drop-in Visits: Visibility

 If monitoring is seeing, then visibility is being seen

  “I think being visible in the school [is important], letting the students know you’re there, letting the teachers know that you’re aware of what’s going on and are informed about what’s going on in the building -- academically and socially and behaviorally.”
Principal Survey: Measures of Self-Reported Expertise

“To what extent do you currently have personal mastery (knowledge and understanding) of the following?” (a little, some, sufficient, quite a bit, a great deal)

1. Standards-Based Reform (.876)
   What students should know and be able to do at each grade level in mathematics
   Aligning instruction, assessments, and materials

2. Principles of Effective Teaching and Learning (.840)
   Effective instructional practices in mathematics
   Evidence-based practices for intervening with struggling students

3. Data-Based Decision-Making (.866)
   Different types of assessments
   Evidence-based procedures for assessing struggling students

4. Developing a School Learning Environment (.877)
   Methods for creating learning cultures
   Elements of school design

5. Monitoring Instructional Improvement (.829)
   Benchmarking
   Procedures for monitoring teachers
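
The parenthesized coefficients read like scale reliability values; the slides do not say how they were computed, but Cronbach’s alpha is the usual choice for multi-item scales. A minimal sketch under that assumption, with fabricated responses on the five-point mastery scale:

# Sketch: Cronbach's alpha for a multi-item survey scale. Assuming (not
# confirmed by the slides) that the coefficients are alphas; the
# responses below are fabricated (1 = a little ... 5 = a great deal).
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: respondents x items matrix of scores."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Six hypothetical principals answering a two-item scale:
responses = np.array([[4, 5], [3, 3], [5, 5], [2, 3], [4, 4], [3, 4]])
print(f"alpha = {cronbach_alpha(responses):.3f}")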
Scenarios
 Four years ago, a new math program was adopted at
 your school. The math program was chosen because
 independent research had shown it to work. Over the
 past few years, math scores on standardized tests have
 not improved significantly. The math scores of poor
 students have decreased slightly.

 Many of your best teachers are convinced that the new
 mathematics program is excellent and should be kept.
 But other teachers are frustrated. A few teachers tell you
 that they think that the math program is at fault. Others
 admit that they are starting to use “whatever works,”
 rather than following the math program.

 Question: How would you address this situation?
Scenarios: Highly Competent Data-Based Decision-Making
 State testing is one benchmark a school looks at to measure
 strengths and weaknesses. Student progress throughout the
 year on other assessments is just as important. Teachers
 need to look at their own teacher made assessments and
 what they are showing as compared to the state tests. The
 staff has to make connections that the objectives the test
 measures are directly related to the concepts and
 standards that they are teaching so the tests are a good
 way to analyze their students’ understanding. Looking at
 other assessments should give teachers the insight in how to
 plan for their students. Teachers need to be trained to look
 at all the data and plan lessons based on the findings.
 More collaboration in each subject area with an in-depth study
 of the results and looking at the sub areas can give them good
 information. I would provide professional development
 opportunities to allow teachers to find a comfort level with using
 test scores to impact their teaching and planning their lessons.
  (Frequency of code = 8; rated perceived competence = 4.5)
Scenarios: Less Competent Data-Based Decision-Making
 Provide teachers with a common planning time
 to go analyze the data for their classroom and
 come up with questions for improvement
 among the grade level. They will then need to
 analyze the entire school test data and develop
 a plan for improvement.
 Each teacher will develop their professional
 plan for improvement for the year based on
 their test results. Overall school goals and
 objectives should be designed around the
 needs of students. Continuous assessment of
 student progress and articulation with and
 among teachers will drive student achievement.
  (Frequency of code = 3; perceived competence on data-based decision-making = 3.5)
Summary of Findings

 We can distinguish between more and less knowledge on the scenarios based on frequency of mentions
 Responses to the scenarios (frequency of mentions) and principals’ self-reports about knowledge are not correlated
 There are correlations in the expected direction between self-reports of knowledge on the principal survey and practices as reported by teachers and principals
 Correlations between the scenarios and teacher and principal responses to the survey are mostly non-significant (a correlation sketch follows below)
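
The slides report the pattern of correlations but not the computation; a minimal sketch, assuming one row per principal (all variable names and values are hypothetical, not study results):

# Sketch: correlate scenario code frequencies with survey self-reports.
from scipy.stats import pearsonr

scenario_code_freq = [8, 3, 5, 2, 6, 4, 7, 3]                  # codes per response
survey_self_report = [4.5, 3.5, 4.0, 4.5, 3.0, 4.0, 3.5, 4.0]  # 1-5 mastery scale

r, p = pearsonr(scenario_code_freq, survey_self_report)
print(f"r = {r:.2f}, p = {p:.3f}")  # a non-significant p would match the finding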
Analysis

 The second principal indicates reliance on a single data source on student achievement to make decisions. In contrast, the first principal explicitly points out the need for multiple sources of evidence when making decisions.
 The first principal appears to be aware that different sources might offer contrary evidence: “Teachers need to look at their own teacher made assessments and what they are showing as compared to the state tests.”
 Principal 1 appears to understand that teachers need training in order to be able to interpret test data; Principal 2 either assumes teachers have had this training or that no special training is necessary.
 The first principal is aware that tests may provide poor evidence of students’ understanding if they are not measuring what teachers are teaching, and that it is important for teachers to know this. The second principal says nothing to indicate this sort of knowledge.
Possible Pitfalls

 Qualitative and quantitative methods grow out of different epistemological and ontological traditions
 Being cognizant of these traditions is critical
