“Why Data?”

Putting Data to Work for
 School Improvement
Welcome
 Presenters
     Vicki DeWitt, Director, Area 5 LTC
     Deb Greaney, Area 5 LTC

 Ground Rules
     Please turn off or silence cell phones
     Be fully present
     Please ask questions and offer comments
Until you have data as a backup,
you’re just another person with an
opinion.

            Dr. Perry Gluckman
Data helps make the invisible,
visible.
Get your facts first, then you can
distort them as you please.

Mark Twain
It is a capital mistake to theorize
before one has data.
Insensibly one begins to twist
facts to suit theories, instead of
theories to suit facts.
          Sir Arthur Conan Doyle
Errors using inadequate data are
 much less than those using no
 data at all.
                Charles Babbage
Everyone is entitled to his own
opinion, but not his own facts.

Daniel Patrick Moynihan
Data is a lot like humans: It is
 born. Matures. Gets married to
 other data, divorced. Gets old.
 One thing that it doesn't do is
 die. It has to be killed.
                        Arthur Miller
In God we trust. The rest of you
bring your data….
   NCLB


 Stronger accountability for results
 More freedom for states & communities
 Encouraging proven education methods
 More choices for parents


 Making AYP
Our Story: ITS REAL
 NCLB grant
 Data gathered
    SkillsCheck technology pre and post tests
    Teacher Technology Proficiency
    Implementation logs
    Classroom observations
    Collection and examination of
          Pre and post vocabulary tests
          Final Team Products
          Writing samples
          Unit templates
          Standardized test data
Results – Writing Samples

     Incorrect Inferences (number of writing samples, by rating):
         Sporadically: 35    Consistently: 8    Not Observed: 208

     Inferences with No Supporting Arguments (number of writing samples, by rating):
         Sporadically: 37    Consistently: 63    Not Observed: 151
Results – Writing Samples

     Summary of Inferences (Valid N = 345 Writing Samples):
         No inferences: 96
         Consistently incorrect: 8
         Consistently without supporting arguments: 63
         Consistently with at least one argument: 130
Results – Writing Samples

     Incorrect Conclusions (number of writing samples, by rating):
         Sporadically: 11    Consistently: 15    Not Observed: 182

     Conclusions Synthesize/Summarize (number of writing samples, by rating):
         Sporadically: 8    Consistently: 81    Not Observed: 119
Results – Writing Samples

     Summary of Conclusions (Valid N = 345 Writing Samples):
         No Conclusions: 140
         Consistently Incorrect Conclusions: 15
         Consistently Conclusions that Repeat Information: 106
         Consistently Conclusions that Synthesize/Summarize: 81
Results – Reading Scores
A Few Questions….

 How are your students doing in
  reading and math?
 Are all your students “meeting”?
 Why are some of your kids not
  meeting?
 Why are some of your kids
  exceeding?
Data Everywhere
 Technology’s influence
 Students who are “over-tested” but “under-assessed”
 Testing results are used to compare rather than to
  improve performance
 “Tons” of reports generated


BUT…
What Usually Happens….
 Data Collected is …
      Not regular/periodic
      Irrelevant – not useful
      Not used
      Cumbersome
      Not clear
            or
 Data is not collected at all.
Does Your School…

 Deliberately set time aside for reflection on
  actual student work?
 Have a process to ensure that teacher
  reflections and insights are used to modify
  current practice?
 Take action as a result of patterns and trends
  that emerge from the data?
    Reflect
    What data do we currently have? Is it
     timely and meaningful?
    What can we really learn from this data?
    Do we look at it, and if so, what’s the
     process? How, how often, who?
    What do we do once we’ve looked at it?
    Do we need to collect different data?
    What is the simplest and most effective
     way to collect meaningful data? (Do you
     need a 25-item multiple choice or one
     open-ended question?)
Ready, Shoot, Aim

   Factors Affecting Student Achievement
Barriers to Meaningful Data Use
 Lack of …
    Authentic training
    Meaningful data
    Time (perceived or real)
    Understanding of the value
 Accountability systems that narrow the focus
   Data-Driven Mania
 A high school in an affluent suburb and its principal are
  recognized and financially rewarded by the state
  because their tenth graders scored 11 percent higher on
  the state assessment than tenth graders the year before.
  But the school had done nothing to improve its
  programs; staff readily acknowledged that a particularly
  academically strong group of tenth graders came along
  that year.

 A high-poverty school is labeled as low-performing
  despite the fact that the staff worked hard to implement a
  new standards-based mathematics program. The staff
  are deeply demoralized.
   and Trivial Pursuits
 A teacher begins to question whether she can continue to
  use an inquiry-based approach to science instruction
  when the state assessment emphasizes science facts,
  not big concepts or inquiry skills.

 A school improvement team decides to provide more test
  preparation in mathematics and to tutor a small number
  of students whose scores on the state assessment fall
  just below the needs-improvement proficiency level.
  There is no discussion of tracking policy; the rigor, focus,
  and coherence of curriculum; or the effectiveness of
  instruction.
    Things to Avoid
 “Bureaucratic Creep”
    Districts adding additional requirements that
     overtax teachers and students but add little useful
     information
 “Rearview Mirror Effect” or planning the future
  on the basis of past events
       Doesn’t allow for responding to a rapidly
        changing reality
       Waits for the road to reveal itself
       Focuses on a single dimension of the road
       Looks back to when times were simpler
Some Things to Do
 Identify antecedents
     “structures and conditions that precede,
      anticipate or predict excellence in
      performance”
     Teacher, student, and system


  Understanding of the effect (results)
   requires understanding of the cause
   (antecedent)
               Learning Matrix
               Reflect: Where do you fit now?

               (Vertical axis: Achievement of Results; horizontal axis:
               Understanding of Antecedents of Excellence)

               LUCKY: High results, low understanding of antecedents.
               Replication of success unlikely.

               LEADING: High results, high understanding of antecedents.
               Replication of success likely.

               LOSING: Low results, low understanding of antecedents.
               Replication of mistakes likely.

               LEARNING: Low results, high understanding of antecedents.
               Replication of mistakes unlikely.
Antecedents of Excellence
 Founded in research
 Examples
     Reinforcement of writing conventions
     Flexibility for teacher management of
      curriculum
     Assessments
     Collaborative scoring of student products
     Meaningful feedback
Some Things to Do
 Institute an accountability system
     action follows analysis
     roles & responsibilities identified
     user-friendly timelines
     power of subtraction

  Accountability includes authority to act and
   permission to subtract
Some Things to Do
 Build a professional, collaborative culture.
      Deliberate effort to build “professional candor”
      Every step of the process requires at least two
       sets of eyes and ears, two brains, and two hearts
      True collaboration occurs only when systems
       are created that embed it in routine processes
       and provide the information and support
       essential to improving practice
Successful Collaboration
 Collaborative cultures
  promote diversity,
  independence, and
  decentralization
 Collaboration must be
  present from planning to
  execution in data-driven
  decision making (DDDM)
 Collaboration does not
  mean “one size fits all”
  professional development
Collaboration Antecedents
 Action planning and continuous improvement
  cycles
 Collaborative improvements
 Lesson logs
 Common assessments
 Instructional calendars
 Data teams
 Program evaluation
Staff “Buy In”
 Teachers must see the value in what they are
  asked to do
 You must “subtract” procedures and practices
  that are not producing results

Data that is collected should be analyzed to
 make improvements. If data is not being
 used, stop collecting it!
   Some Things to Do
 Learn what you can from standardized tests
 Use multiple measures, including common grade-level, subject-area,
  or course-specific assessments




   A dynamic, local assessment system encompassing multiple
                      measures of assessment
Some Things to Do


 Analyze data on
  multiple levels
 Use/develop common
  classroom level
  assessments



                    Using multiple measures to dig into
                    student learning results
Data Retreats
 Three days in the summer to examine data
 Work in teams
 CESA 7 –Judy Sargent
The Process – 8 Steps

Before the retreat (Prep Packet):
1. Team Readiness
2. Collect & Organize Data

During the retreat:
3. Observe and Analyze Patterns
4. Pose Hypotheses
5. Prioritize & Set Improvement Goals
6. Design Study & Strategies
7. Define Evaluation Criteria

After the retreat:
8. Make the Commitment & Plan the Roll-Out
Four Lenses of Data

  Student Data
  Professional Practices Data
  Family & Community Data
  Program & Structures Data

                                   Broad to Specific
                                   Specific to Patterns
                                   Observing Patterns
     No Excuses

                    INFLUENCE   CAN’T INFLUENCE
     CONTROL
     CAN’T CONTROL
Some Things to Do

 Curriculum mapping/alignment
 Assessment calendar
 Performance assessment checklist
    Assessment Calendar Template

    Columns: Assessment | Collection Date/Window | Disaggregation Date/Window |
    Analysis | Reflection | Recommendation of Changes | Decision Point |
    Written Rationale | Dissemination to Stakeholders

    Rows: State Assessment | Writing Assessment | ITBS | Common Assessments |
    Performance Assessments | Unit Tests | Other
 Scheduled times to collect, aggregate, & disaggregate data
 Required time for analysis, reflection, and recommendations
 Decision points to proceed with status quo or implement change recommendations
 Written rationale for each decision
 Dissemination of rationale driven by data to affected parties



   Beyond the Numbers: Making Data Work for Teachers & School Leaders,
   Stephen White, Ph.D.
Things to Do
 Identify antecedents
 Institute an accountability system
 Build a professional, collaborative culture
 Use multiple measures, including common
  grade-level, subject area, or course-specific
  assessment
 Reveal the operational curriculum
In Summary
 Expertise in data analysis is the ability to use
  information to solve problems and identify
  solutions consistently, efficiently, and
  effectively
 Teachers and leaders who identify
  antecedents, engage in collaboration, and
  hold themselves accountable for results
  demonstrate expertise in data analysis

Take time to stop and smell (analyze) the
  roses (data).
Thank You!
 Contact info
 vdewitt@lth5.k12.il.us
 dgreaney@lth5.k12.il.us
 www.lth5.k12.il.us
				