ENSURING ACCURATE FEEDBACK FROM OBSERVATIONS
Great Lakes TURN Forum
Presentation by: Craig D. Jerald
Break the Curve Consulting
May 4, 2012
Two Kinds of “Feedback” from Observation

   End of Year, Based on MULTIPLE Observations
     Quantitative data (scores)
     Inform major decisions
       Teacher: Choose PD to improve an aspect of instruction
       Administration: Personnel decisions

   Throughout Year, Based on SINGLE Observations
     Qualitative “coaching conversation” during a “post-conference” with teacher following the observation
     Informs ongoing improvement of practice by identifying:
       Effective practices to extend into future lessons
       Less effective practices to improve in future lessons
Post-Observation Feedback Can Be Powerful

   Cognitive Science
     Feedback is critical for improving practice in any field of endeavor, from music to sports to professions
   Recent Education Studies
     Taylor and Tyler study of Cincinnati TES
       Students of mid-career teachers scored significantly better the year following teacher’s participation
     Experimental study of My Teaching Partner
       Substantial impact on secondary students’ performance on Virginia assessments year following teacher’s participation
       Equivalent of moving from 50th to 59th percentile
Post-Observation Feedback Can Be Powerful

   “My Teaching Partner”: professional development based on the Classroom Assessment Scoring System (CLASS) observation instrument
Post-Observation Feedback Must Be Accurate

   Inaccurate Feedback Costs Teachers and Students
     If weak practice misclassified as strong, teacher will
      extend the practice into future lessons
     If strong practice misclassified as weak, teacher will not
      be investing precious time/energy most efficiently
   The Challenge:
     How can school systems ensure that feedback avoids major errors in classification and instead reflects accurate judgments of practice based on evidence collected during a classroom observation?
Knowledge Development Project
   Gates Foundation Partnership Sites
       Atlanta
       Denver
        Hillsborough County, FL
       Los Angeles CMOs
       Memphis
       Pittsburgh
        Prince George’s County, MD
       Tulsa
   Additional organizations
       DC Public Schools
       Tennessee
       American Federation of Teachers
       National Institute for Excellence in
        Teaching (TAP System)
       University of Virginia (CLASS)
A New Job for Staff: How to Ensure Success?

   QUESTION: If you gave someone on your team a
    critical but very challenging new job to perform, one
    they had never attempted before, how would you help
    them succeed in it?
      Train them well and give them the right tools for the job
     Remove obstacles to success in the field

     Monitor their work, and if necessary solve problems
Ensuring Accuracy: Three Key “Action Areas”

   1) Build Observers’ Capacity
     Provide observers with the necessary knowledge, skills, and tools to observe accurately

   2) Create Conducive Conditions
     Even well-equipped observers can fail if they encounter significant obstacles “in the field”

   3) Monitor and Ensure Quality
     Taking the extra step to audit results in order to identify and remediate problems
Building Observers’ Capacity: Training

   Understanding the observation instrument

     Trainers show short video segments to illustrate practice at different performance levels
Building Observers’ Capacity: Training

   Typical “flow” of observation training
     Understanding the observation instrument
     Techniques for collecting appropriate evidence
     Aligning evidence to the observation instrument
     Practicing evidence collection and scoring
     Sources of bias and how to mitigate them

   Lessons learned by school systems
     Collaborative, with plenty of opportunity for dialogue
     Lots of practice with video-recorded lessons
     Some practice observing “live” lessons
Building Observers’ Capacity: Certification

   Certification assessment following training
     Scoring “normed” video-recorded lessons
       Memphis: Certification Committee of teachers, principals, and administrators to establish “gold standard” scores for certification videos
     Live observations with trainers or experts
       Hillsborough: Successfully complete two live observation cycles to satisfaction of trainer
     Pass/Fail only, or a “conditionally certified” category?
   Periodic re-certification assessments
Building Observers’ Capacity: Tools

   Most important tool: The observation instrument
     Must support reliable observation through
       Clarity of language
       Descriptors and examples
       Manageable number of dimensions
     Pilot testing, gathering feedback from teachers and observers, refining observation instrument
   Other tools for capturing and aligning evidence
       PUC Schools:
             LiveScribe pens
             Evidence Guide
Building Observers’ Capacity: Reinforcement

   Even highly calibrated observers can experience “rater drift” over time
   Examples of reinforcement include:
     Deep-dive trainings for groups of observers focused on critical dimensions of instrument
     One-on-one coaching
     Paired observations (video or live)
     Group calibration sessions (video or live)
       P.G. County: Videoconferencing enables large groups of up to 40 observers to co-observe and score live lessons
Conducive Conditions: Manageable Caseload

   Decreasing top of ratio (observation workload)
     Decreasing number of teachers
       Pittsburgh: One-third of experienced teachers per year participate in alternative observation system using peer observation and coaching
     Decreasing number of observations
       Hillsborough County: “Proportionate” approach requires teachers to be observed from 3 to 11 times per year depending on prior year’s evaluation results
     Decreasing minimum time per observation or number of dimensions to be scored
Conducive Conditions: Manageable Caseload

   However, keep in mind this MET Project finding:
Conducive Conditions: Manageable Caseload

   Increasing bottom of ratio (observation capacity)
     Training and certifying more administrators
       School-level (assistant principals, etc.)
       District-level
     Training and certifying a cadre of teacher-leaders
       Hillsborough County: Rotating peer and mentor evaluators
       DC: Permanent master educators
       In both cases, costs less than 2% of overall personnel budget
       Other advantages
             Leadership opportunity for teachers
             Can match observers to teachers’ subject area and grade level
             Feedback from observers with recent classroom experience
       Taylor & Tyler study of Cincinnati: Net student learning gain
Conducive Conditions: Positive Culture

   Communication: Helping teachers and administrators
    understand observation system and how it can support
    improvement
   Collaboration: Inviting some teachers to help develop or
    select observation instrument or establish “gold
    standard” scores for observers’ certification assessments
   Calibration: Providing all teachers with opportunities to
    reach a deeper understanding of the observation
    instrument so they can begin to calibrate their own vision
    for effective instruction against it
   Coaching & Professional Development: Providing
    teachers with meaningful opportunities to improve on the
    practices measured by the observation instrument
Conducive Conditions: Positive Culture

   Calibration
     TAP   System:
       During first year, regular PLC meetings focus on understanding the observation instrument (TAP Rubric)
       Master and mentor teachers model effective practices in
        observation instrument in PLC meetings and in teachers’ own
        classrooms with teachers’ own students
       Teachers score their own observed lessons using instrument,
        with self scores counting toward final evaluation score
   Coaching & Professional Development
     Hillsborough: Office of Staff Development offers PD
      courses aligned with specific dimensions of instrument
Monitor & Ensure Accuracy

   Analyze data from observations to flag patterns
    that suggest problems with accuracy
     Inter-rater reliability of scores
     Anticipated distribution of scores

     Alignment with other measures such as value-added

   Audit evidence collected by observers to examine
    alignment with scores
   Double score a sample of teachers using impartial
    observers
     Described in the MET Project reports
Monitor & Ensure Accuracy: NIET Example

   Observers will “re-calibrate” on the Questioning dimension using normed videos
Wrapping Around the School Year
Additional Information

   Written report is available on the Bill & Melinda Gates Foundation website at:
    www.gatesfoundation.org/college-ready-
    education/Pages/college-ready-resources.aspx

   Craig Jerald can be reached at:
     (202)232-5109
     craig@breakthecurve.com

				