North Carolina Department of Public Instruction
NC State University/Friday Institute for Educational Innovation
SERVE Center at the University of North Carolina-Greensboro
SETDA

Quasi-Experimental, Experimental, and Case Studies


North Carolina State University/Friday Institute for Educational Innovation
Comprehensive Multi-method Evaluation Design

[Diagram: Case Study, Quasi-Experimental Design, and Experimental Design components]
Quasi-Experimental Design: Process

Quasi-experimental approach:
   A matched-groups, mixed between/within longitudinal design
   In 2003, 11 comparison schools were selected based on
       Grade structure
       Geographical proximity
       2001-02 End of Grade (EOG) scores
       Student demographics
       Size
   For all variables, hypotheses were framed as 1d - 1a > 2d - 2a, i.e., pre-to-post gains in IMPACT schools exceed gains in comparison schools (see the sketch below)
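A minimal sketch of this hypothesis test, assuming "1"/"2" index the IMPACT and comparison groups and "a"/"d" the baseline and final measurements; all data and numbers here are simulated stand-ins, not project data.

```python
# Hypothetical sketch: gain-score (difference-in-differences) test of
# 1d - 1a > 2d - 2a, where a = baseline, d = final, 1 = IMPACT, 2 = comparison.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
impact_pre = rng.normal(250, 10, 100)              # simulated baselines, IMPACT
impact_post = impact_pre + rng.normal(6, 5, 100)   # simulated growth
comp_pre = rng.normal(250, 10, 100)                # simulated baselines, comparison
comp_post = comp_pre + rng.normal(3, 5, 100)

impact_gain = impact_post - impact_pre             # 1d - 1a
comp_gain = comp_post - comp_pre                   # 2d - 2a

# One-sided independent-samples t-test: are IMPACT gains larger?
t, p = stats.ttest_ind(impact_gain, comp_gain, alternative="greater")
print(f"t = {t:.2f}, one-sided p = {p:.4f}")
```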
Quasi-Experimental Design: Variables and Measures (selected)

   Variable                       Measure
   Student achievement            EOG tests
   Student technology skills      Tech skills surveys
   Teacher tech skills            NETS-T survey
   Attitudes toward technology    TAC and TAT surveys
   Instructional strategies       AOI survey
   Stages of Concern              SOC questionnaire
   Leadership style               Leadership Practices Inventory
   Implementation of model        STNA
Quasi-Experimental Design:
IMPACT and Comparison Student Demographics, 2005-06

              Male    Female   White   Black   Hisp.   Amer. Indian   Asian   Multi-Racial   LEP    Migrant   EC      FRL
 IMPACT       50.7%   49.3%    45.5%   48.6%   2.8%    0.2%           0.6%    2.2%           1.9%   0.3%      14.6%   66.4%
 Comparison   50.5%   49.5%    33.8%   55.8%   7.0%    0.5%           0.5%    2.3%           2.7%   0.2%      21.5%   64.4%

Source: DPI 2005-06 School Report Cards, http://www.ncreportcards.org/src/.
Quasi-Experimental Design: Teacher Quality Statistics

  Teacher Quality           School Year   IMPACT   Comparison   State Avg.
  % Fully Licensed          2002-03        82.6      76.0         84.0
                            2003-04        86.0      85.1         85.0
                            2004-05        88.6      87.8         88.8
                            2005-06        90.6      92.6         88.3
  % Emergency License       2002-03         4.3       5.6          NA
                            2003-04         2.6       3.7          NA
                            2004-05         3.4       3.4          5.2
                            2005-06         2.1       2.3          3.3
  % Lateral Entry           2002-03         9.5       8.7          NA
                            2003-04         8.5       8.2          NA
                            2004-05         6.2       5.9          6.5
                            2005-06         5.2       3.9          5.8
  % Classes Taught by HQT   2002-03        80.8      86.8         83.0
                            2003-04        87.6      86.3         85.0
                            2004-05        89.3      87.2         87.4
                            2005-06        97.8      96.8         93.6
Quasi-Experimental Design:
IMPACT and Comparison Teacher Retention, Years 1-3

[Chart: teacher retention (60-100% scale shown) for IMPACT vs. comparison schools, by years of experience: 0-3, 4-7, 8-10, 11-15, >15]
Quasi-Experimental Design: Implementation (STNA)

IMPACT schools were rated more highly by teachers in all 13 areas:

   Vision and leadership
   Technology planning, budgeting, evaluation
   Supportive environment for risk-taking
   Resources: media, software tools
   Community linkages
   Professional development
   Classroom practice: instructional strategies
   Classroom practice: planning
   Student activities
   Teaching practices
   Student outcomes (perceived)

All effects significant at p < .001; partial η² ranged from 0.05 to 0.43 (see the formula below).
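For reference, partial eta-squared (the effect-size measure reported here and on later slides) is the proportion of outcome variance attributable to an effect, with other modeled effects partialled out:

```latex
\[
\eta_p^2 = \frac{SS_{\text{effect}}}{SS_{\text{effect}} + SS_{\text{error}}}
\]
```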
Quasi-Experimental Design: Implementation (IMPACT rubric)

IMPACT schools were rated more highly by teachers in 8 of 16 areas over 3 years:

   Instruction
   Collaboration
   Needs assessment
   Managing resources
   Designing facilities
   Policies
   Planning
   Evaluation

All effects significant at p < .05; partial η² ranged from 0.33 to 0.68.
Quasi-Experimental Design: Leadership Ratings (LPI)

   All IMPACT principals who were present for all three years of the grant were rated more highly in Year 3 than in Year 1 on all 5 constructs (Challenging the Process, Inspiring a Shared Vision, Enabling Others to Act, Modeling the Way, and Encouraging the Heart).
   These principals grew most in "Challenging the Process" and "Inspiring a Shared Vision."
Quasi-Experimental Design: LPI Principal Ratings, Year 1-Year 3

[Chart: mean LPI scores (7-10 scale shown) in Year 1 (N=95) and Year 3 (N=208) on the five LPI constructs: CTP, EOA, ETH, ISV, MTW]
Quasi-Experimental Design: Leadership Team Ratings on LPI

   On all 5 constructs, media coordinators out-scored principals in absolute terms.
   On 4 of 5 constructs, technology facilitators out-scored principals in absolute terms.
   These findings indicate that IMPACT teachers value the leadership qualities of media coordinators and technology facilitators, and that these individuals are seen as better leaders, in some respects, than school principals.
Quasi-Experimental Design:
LPI Ratings for Media Coordinators, Principals, and Technology Facilitators

[Chart: mean LPI scores for media coordinators (N=257), principals (N=277), and technology facilitators (N=247) on the five LPI constructs: CTP, EOA, ETH, ISV, MTW]
Quasi-Experimental Design: Teacher Outcomes (ISTE NETS-T)

[Chart: mean NETS-T scores (2.0-3.0 scale shown) for IMPACT vs. comparison teachers, pre and post, Years 1-3]
Quasi-Experimental Design:
Teacher Outcomes (attitudes toward technology)

IMPACT teachers showed stronger change in attitudes, or more positive attitudes overall, on:

   Perceived utility of IT
   Email
   Internet
   Multimedia
   Productivity (teacher)
   Productivity (student)
Quasi-Experimental Design:
Teacher Outcomes (attitudes toward technology, continued)

IMPACT teachers showed stronger change in attitudes or more positive attitudes overall on (see the sketch below for how such an interaction F is obtained):

  Subscale                 IMPACT    IMPACT    Comp.     Comp.     F for interaction   partial η²
                           Time #1   Time #6   Time #1   Time #6   (all df 1, 197)
  Teacher-centered         4.33      4.22      4.49      4.03      14.57***            .07
  activities
  Constructivism           3.17      3.52      3.26      3.24      9.85**              .05
  Technology utilization   2.90      4.06      2.87      2.82      83.24***            .30
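A minimal sketch of how a group-by-time interaction F like those in the table can be obtained; the data, column names, and effect size are simulated stand-ins, and this OLS version treats time as between-subjects for simplicity, whereas the actual analysis used a mixed between/within design.

```python
# Hypothetical sketch: F-test for the group x time interaction reported above.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

rng = np.random.default_rng(3)
n = 100
df = pd.DataFrame({
    "group": np.repeat(["IMPACT", "Comparison"], n * 2),
    "time": np.tile(np.repeat(["t1", "t6"], n), 2),
    "score": rng.normal(3.0, 0.5, n * 4),
})
df.loc[(df.group == "IMPACT") & (df.time == "t6"), "score"] += 0.8  # simulated gain

model = smf.ols("score ~ C(group) * C(time)", data=df).fit()
print(anova_lm(model, typ=2))  # the C(group):C(time) row is the interaction F
```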
Quasi-Experimental Design: Teacher Stages of Concern, Years 1-3

[Chart: percentile rankings on Stages of Concern (Stages 0-6) for Year 1 (N=389), Year 2 (N=264), and Year 3 (N=287)]
Quasi-Experimental Design:
Student Use of Computers in Grades 3-5, 2004-05

[Chart: percent responding "Yes" in IMPACT vs. comparison schools for computer use in core subject areas, research for reports, word processing, and presentations]
Quasi-Experimental Design:
IMPACT v. Comparison Media Center Visitation, Year 1-Year 3

[Chart: average number of media center visits per week for IMPACT vs. comparison schools, 2003-04 through 2005-06]
Quasi-Experimental Design:
IMPACT v. Comparison Math Achievement

[Chart: mean math EOG scale scores (247-261 range shown) for IMPACT vs. comparison schools, 2002-03 through 2004-05]

Effect significant at p < .0001, controlling for grade, race, exceptionality, free/reduced lunch, sex, and absenteeism (see the sketch below).
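A minimal sketch of a covariate-adjusted group comparison of the kind described above, using an OLS model with the same control variables; all data and column names are simulated stand-ins for the actual student-level EOG data.

```python
# Hypothetical sketch: IMPACT vs. comparison effect on math scores,
# adjusted for the covariates named on the slide above.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
n = 500
df = pd.DataFrame({
    "score": rng.normal(254, 8, n),           # simulated EOG scale scores
    "impact": rng.integers(0, 2, n),          # 1 = IMPACT school
    "grade": rng.choice([3, 4, 5], n),
    "race": rng.choice(["A", "B", "C"], n),
    "exceptionality": rng.integers(0, 2, n),
    "frl": rng.integers(0, 2, n),             # free/reduced lunch
    "sex": rng.choice(["M", "F"], n),
    "absences": rng.poisson(5, n),
})
df.loc[df.impact == 1, "score"] += 2          # simulated treatment effect

model = smf.ols(
    "score ~ impact + C(grade) + C(race) + exceptionality + frl + C(sex) + absences",
    data=df,
).fit()
print(model.params["impact"], model.pvalues["impact"])  # adjusted group effect
```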
Quasi-Experimental Design:
Reading Growth 2003-2005, by Grade

[Chart: EOG reading growth from baseline to Year 2 (0-20 point scale shown) for IMPACT vs. comparison schools, Grades 3, 5, and 8]

Effect significant at p < .05, controlling for free/reduced lunch, race, exceptionality, sex, absenteeism, and parent education.
Case study: process

   In the 2004-2005 school year, a preliminary case study of one intervention school's community outreach program was conducted.

   Data sources included
       Phone interviews with patrons
       Structured interviews with staff
       Archival documents (e.g., attendance data, course offerings, budget data)

   In 2005, funds for the case study were redirected to the experimental design component.
Case study: outcomes

   Findings suggest that low-cost
    technology alternatives can be
    beneficial to school-based
    community outreach programs
   At the same time, personal
    attributes of key staff played a
    pivotal role in the success of
    programming.
Experimental Design: Process

   In 2004, Schools Attuned was
    selected as the intervention in the
    experimental design
   However, a different intervention
    (IRCMS) was approved and
    implemented, beginning in the
    spring of 2005.
Experimental Design: Process (continued)

                      IMPACT                         Non-IMPACT

  IRCMS               23 classrooms                  14 classrooms
  Program             (3rd=10, 4th=5, 5th=8)         (3rd=5, 4th=4, 5th=5)
                      570 students                   267 students

  No IRCMS            22 classrooms                  16 classrooms
  Program             (3rd=5, 4th=7, 5th=10)         (3rd=6, 4th=5, 5th=5)
                      497 students                   351 students
Experimental Design: Student Measures

   Gates-MacGinitie Reading Comprehension Test
   Reading EOG
   Metacomprehension Strategy Index
   Jr. MAI (Metacognitive Awareness Inventory)
   Reading Efficacy
   Teachers' ratings of student metacognition
Experimental Design: Teacher Measures

   Technology use survey
   TAC survey
   DeFord Theoretical Orientation to Reading Profile (TORP) (pretest only)
   MAI (Metacognitive Awareness Inventory)
   Teachers' Sense of Efficacy Scale (TSES)
   Teaching Reading Efficacy
Formative Evaluation

SERVE Center at the University of North Carolina-Greensboro
LANCET Implementation and Outcomes

Capacity for Applying Project Evaluation (CAPE)
www.serve.org/evaluation/capacity/

Elizabeth Byrom, SERVE
Jenifer Corn, SERVE
CAPE is…

 A suite of resources, tools, and
 professional development
 activities, designed to help
 educators collect and use data
 to make decisions that will help
 them improve the
 implementation and impact of
 their technology projects.
SERVE’s Role
   Collaborate with partners
   Identify or develop resources and
    tools
   Design and facilitate on-going
    professional development and
    support for school/district team
   Document lessons learned about
    capacity building for project
    evaluation
School/District Teams’ Role
   Create a project logic map
   Develop an evaluation plan for their
    EETT project
   Implement evaluation plan
       Collect and analyze data
       Use data to make informed decisions
       Make adjustments to project
        implementation
Capacity for Evaluation
   Formative Evaluation – used to "monitor and adjust" projects, to the ultimate benefit of students
   Capacity – the "organizational wherewithal" to undertake project evaluation, more than just skills and knowledge for individuals
CAPE Components
   A Theoretical Framework for Capacity
    Building
   An Evaluation Framework
   A Professional Development Model
Framework for Capacity Building

Foundation: Drivers for Change in Schools (Fullan, 2005). The desired change is the adoption of formative project evaluation practices.

1. Engaging Moral Purpose
   Engaging teachers' beliefs, the need or motivation to undertake formative project evaluation (Fullan, 2005)

2. Understanding the Change Process
   Understanding the change process to engender ownership of evaluation work (Fullan, 2005; Hall & Hord, 1984; Horsley & Loucks-Horsley, 1998; Rogers, 1995; Waters, Marzano, & McNulty, 2003)

3. Building Capacity
   Collective and ongoing policies, strategies, resources, and other actions to increase organizational power to implement project evaluation (Newmann, King, & Young, 2000, as cited in Fullan, 2005, p. 40)
   A. Knowledge, Skills, and Attitudes of Individuals (Guskey, 1986, 2000)
   B. Resources – infrastructure, tools, people, money, and time needed to adopt the innovation
   C. Professional Community (Wenger, McDermott, & Snyder, 2002)
   D. Program Coherence (Newmann, Smith, Allensworth, & Bryk, 2001)
   E. Shared Leadership (Lambert, 2002)
   Shared Identity: motivation to work together on evaluation (Fullan, 2005)
The CAPE Evaluation Framework

   Introduction – How to Use the Resources
   Getting Started – Planning for Evaluation
   Theory – Explaining How Your Project Works
   Outcomes – Goals, Objectives, and Strategies
   The Plan – Basic Components
   Data Sources – Some Examples
   Implementation – Putting the Evaluation to Work
   The Report – Communicating the Results
   Examples – Real Evaluation Plans and Reports
   Resources – Index of Materials
Evaluation Planning Tools
   Logic mapping activities and
    templates
   Strategy and objective planning
    templates and guides
   Data-collection planning guides
Evaluation Planning

   Map project logic
   Clarify strategies and objectives
   Define evaluation questions
   Propose benchmarks
   Select methods and measures
   Conduct the evaluation
   Draw inferences from data
IMPACT Model School Logic Map
IMPACT Model School Objective
Planning Guide
Data Sources for EETT Projects
   Technology Needs Assessment
   Classroom Observation
   Technology-Partnership Survey
   Professional Development
    Questionnaire
   Rubrics for lesson plans and student
    products
   Teacher Reflection Log
CAPE Instruments and Protocols

   School Technology Needs
    Assessment (STNA)
   Professional Development
    Questionnaire (PDQ)
   Looking for Technology Integration
    (LoFTI) drop-in protocol
   Technology and School-Family-
    Community Partnership survey
School Technology Needs
Assessment (STNA)
Online STNA

   Bar graphs
   Repeated use indicates changing needs over time
   Used in about 200 schools to date, with more than 7,914 respondents
   Now in Version 3.0
STNA Report
    STNA Research Study
Internal Consistency Reliability (N=2094)
   Data analyses showed each of STNA's constructs and subconstructs to have high internal consistency reliability (alpha ranged from .807 to .967); a computational sketch follows.
   These results indicate that STNA is a high-quality survey instrument that provides schools and districts with information that can be used to make decisions about each of the constructs and subconstructs.
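A minimal sketch of how internal consistency (Cronbach's alpha) is computed for one construct's items; the respondent-by-item matrix here is simulated, not STNA data.

```python
# Hypothetical sketch: Cronbach's alpha for a set of survey items
# (rows = respondents, columns = items on one construct).
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    k = items.shape[1]                         # number of items
    item_vars = items.var(axis=0, ddof=1)      # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

rng = np.random.default_rng(1)
latent = rng.normal(size=(200, 1))             # shared trait -> correlated items
items = latent + rng.normal(scale=0.5, size=(200, 5))
print(f"alpha = {cronbach_alpha(items):.3f}")  # high, since items share a factor
```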
STNA Research Study
Exploratory Factor Analysis (N=2050)
   The initial analyses revealed 13 factors with an eigenvalue greater than one, accounting for 62.32% of the total variance (see the sketch below).
   Ten of the 13 factors were largely the same constructs initially identified for STNA.
   These results provided strong support for the validity of the constructs identified within STNA.
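A minimal sketch of the eigenvalue-greater-than-one (Kaiser) screening step behind such an analysis; the responses are simulated noise rather than STNA data, so the retained count will differ from the 13 reported above.

```python
# Hypothetical sketch: Kaiser criterion (eigenvalues of the correlation
# matrix > 1) used to choose how many factors to retain before an EFA.
import numpy as np

rng = np.random.default_rng(2)
n_respondents, n_items = 2050, 40
responses = rng.normal(size=(n_respondents, n_items))  # stand-in survey data

corr = np.corrcoef(responses, rowvar=False)
eigenvalues = np.linalg.eigvalsh(corr)[::-1]           # sorted descending

n_retain = int((eigenvalues > 1).sum())
pct_var = eigenvalues[:n_retain].sum() / n_items * 100
print(f"retain {n_retain} factors, explaining {pct_var:.1f}% of total variance")
```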
Professional Development
Questionnaire (PDQ)
PDQ

   Easily adapted to specific settings or
    activities
   Assesses participants’ perceptions of
    the quality of professional
    development implementation
   Does not provide data about the
    impact of PD activities – whether
    they made a difference
LoFTI

Looking For Technology Integration: a classroom technology observation protocol
    LoFTI

   Designed through collaboration with a team of school practitioners
   Reports a profile of technology use at
    the school level
   Paper-pencil version available
   Palm version almost ready
School-Family-Community Survey
    School-Family-Community Survey

   Designed for a range of
    stakeholders—staff, parents, others
   Use results in making decisions about
    technology for supporting family and
    community involvement efforts
   Version 1.0 is available online or in
    paper-pencil form
CAPE Professional Development

   Face-to-Face: introductory event for teams of educators
   Computer-Mediated Professional Development (CMPD): a series of sessions alternating between multiple-project and individual-project work
   Face-to-Face: culminating event for teams of educators

An online Community of Practice (CoP) runs throughout.
CAPE Professional Development
   Academies and Institutes
   Workshops
   Virtual Meetings, conference calls
    and videoconferences (CMPDs)
   Presentations
   Online community of practice
   Teams sharing successes and lessons
    learned
   Technical Assistance
CAPE Professional Development
   CMPD Topics
       Initial Implementation of Evaluation Plan
       Evaluation Management Plans
       Baseline Data Collection
       Maximizing School Buy-in & Community
        Support
       Data Analysis & Interpretation
Notes to Project Leaders

   Identify and address the challenges
    and costs of evaluating
    projects/programs.
   Use team-based planning and
    implementation of evaluations.
   Recognize that collecting data is
    relatively easy—analyzing and
    using data is the hard part. Both
    require a lot of time.
Notes to Project Leaders
   Communicate to generate buy-in.
   Define and share the evaluation
    purpose—needs assessment,
    required reporting, data-driven
    planning, or program improvement?
   Reach consensus on a definition of
    the program or project being
    evaluated.
Notes to Project Leaders

   Separate project implementation
    from impact, and measure both.
   Define the evaluation questions that
    matter to the evaluation purpose.
   Plan for and collect all of the data,
    and only the data necessary to
    answer the questions.
   Manage the evaluation process.
Capacity for Evaluation
       Capacity Building – NC DPI
   IMPACT I Schools

   IMPACT II Schools

   1-2-1 Grant Schools

   IMPACT Academies based on the SERVE ATA Model

   Collaboration Toolkit

   IMPACT Video Series

   NC LEA and Charter School Educational Technology
    Plans
Dissemination Activities
LANCET Dissemination
   NC State Dissemination Activities


                  Ellen Vasu
                  Jason Osborne
                  Lisa Grable
NC State Dissemination Activities
   Publications
       Corbell, K.A., Osborne, J.W., & Grable, L.L. (in press). Examining the Performance Standards for Inservice Teachers: A confirmatory factor analysis of the Assessment of Teachers' NETS-T Expertise. Computers in the Schools.
       Osborne, J.W., Overbay, A., & Vasu, E.S. (in press). Designing grant proposals and evaluation plans in the age of No Child Left Behind. Journal of the American Association of Grant Professionals.
       Overbay, A., Grable, L.L., & Vasu, E.S. (2006). Scientifically-based research: Postcards from the edge. Journal of Technology and Teacher Education (JTATE), 14(3), 623-632.
NC State Dissemination Activities

   Manuscripts in Preparation
       Measuring Teacher Attitudes Toward
        Computers and Teacher Attitudes
        Towards Information Technology
       Dimensions of Technology Skills
       Learning Styles and Resistance to
        Change
NC State Dissemination Activities

   Presentations
       26 National and International
        Conference Presentations and
        Workshops
       6 State and Regional Conference
        Presentations
SERVE Dissemination Activities
   CAPE Website
    http://www.serve.org/Evaluation/Capacity/
       5,434 hits since November 10, 2006
   Manuscripts in Preparation
       CAPE Framework, STNA, CAPE PD Model, Data
        Sources for Evaluating Technology Projects
   Presentations
       13 Evaluation Academies/Institutes/Workshops
       13 National Conference Presentations and
        Workshops
       9 State Conference Presentations and Workshops
SERVE Dissemination Activities

   Instruments
       School Technology Needs Assessment
        (STNA) (n=7914)
       Professional Development Questionnaire
        (PDQ)
       Looking for Technology Integration
        (LoFTI) drop-in protocol
       Technology and School-Family-
        Community Partnership survey (n=88)
SERVE Dissemination Activities
   Building Evaluation Capacity Studies
       Microsoft Partners in Learning
       Irvine Foundation study participant
   Spread
       REL-SERVE Evidence-Based Education
       National Center for Homeless Education
       SETDA-Polyvision Study
       Graduate Courses – NCSU, UCF, Johns
        Hopkins
       Dissertation/Thesis – NCSU, UNC
     Dissemination – NC DPI

 IMPACT Grants
 1-2-1 Grants

 IMPACT Guidelines revision

 IMPACT for Administrators

 IMPACT Website
Dissemination – NC DPI

 North Carolina State Board of
  Education Future-Ready Students
 Future-Ready Classrooms initiative
Roadmap to Replicability


NC State University/Friday Institute
for Educational Innovation
Previously Validated Instruments

   State End-of-Grade tests (grades 3-8)
   NC Writing Test (grades 4 & 8)
   NC Computer Skills Test (grade 8)
   Gates-MacGinitie Reading Test (Grade 2, primary schools
    only)
   Computer Attitude Questionnaire (4-8)
   Young Children’s Computer Inventory (K-3)
   Teacher attitude toward technology integration (TAT)
   Teacher attitude toward computers (TAC)
   Stages of concern questionnaire
   Resistance to Change
   Leadership Practices Inventory (LPI)
Reviewed Instruments
   Examined the factor structures of:
       Teachers’ Attitudes Toward Computers (TAC)
       Teachers’ Attitudes Towards Information
        Technology (TAT)
       Performance Standards for Inservice Teachers
       Technology Skills Checklist 3-5
       Technology Skills Checklist 6-8
       School Technology Needs Assessment (STNA)
   Activities of Instruction Survey was also
    reviewed
Other Instruments Used
   Classroom Climate (3-8)
   Teacher and Administrator Demographic surveys
   NETS-A Performance Profile (Administrators)
   IMPACT Rubric
   IMPACT Implementation Checklist
   Classroom Equipment Inventory
   Media and Technology Inventory
Treatment and Control Considerations

   Competitive grant application
    process
   Comparison group incentives
   Cross contamination
   Time intensive matching process
   Requires personal contact with all
    groups
   Attrition
Assessing Students in K-2

   State prohibition of primary grade
    standardized academic assessment
   Expense of appropriate instruments
    and cost of extra testing
    administrators
   Group administration requires one-
    on-one attention
   Young ELLs
Exposure Issues

   Teacher concerns about observation/evaluation
   Only a few schools involved in a very high-profile project
   Desire to "look good"
Data Collection
   Paper and pencil or electronic
       Computer access, reduced response rate
       Logistics of distribution and collection of paper
        surveys
   Middle school students- no single
    classroom teacher
   Student information systems
   Formative v. external evaluation
   Site visits, no normal school days
              Navigating the Regulations

                  Obtaining disaggregated student
                   information and interpreting policy
                       Family Educational Rights and Privacy
                        Act
                       Department of Agriculture controls Free
                        and Reduced Lunch Information




Overbay, A., Grable, L.L., & Vasu, E.S. (2006). Scientifically-based research: Postcards from the edge.
            Journal of Technology and Teacher Education (JTATE), 14(3), 623-632.
Roadmap to Replicability


SERVE Center at the University of
North Carolina-Greensboro
LANCET Roadmap to Replicability

 Inferred Insights into Capacity
  Building for Project Evaluation:
    Lessons Learned from the
          IMPACT Schools

     The SERVE Center at UNCG
           Elizabeth Byrom
             Jenifer Corn
 Lessons Learned
Lessons learned are derived from
a content analysis of qualitative
data from focus groups and
individual interviews with
educators in the IMPACT schools.
Framework for Capacity Building

(The capacity-building framework presented earlier: 1. Engaging Moral Purpose; 2. Understanding the Change Process; 3. Building Capacity, with components A-E and a shared identity.)
 Lesson:

Project evaluation is a
complicated process requiring
cooperation among multiple
people; it is important that
everyone involved speak the
same language.
Hint for evaluation capacity
builders…

   Help project management and/or
    project evaluation teams establish a
    glossary of evaluation terms that will
    be used for their project.
   It's more important for evaluation teams to use the same definitions than it is for them to use the "right" definitions.
 Lesson:

In order to build capacity for evaluation,
the purpose or purposes of any
evaluation effort must be meaningful,
explicit, and understood by everyone
involved.
It helps tremendously if everyone
involved believes in the purpose of the
evaluation.
Hint for evaluation capacity
builders…

   Because purposes for an evaluation
    may differ at various levels (SEA, LEA,
    school, IHE), it’s important to clarify
    the different purposes. Make sure that
    everyone participating in the evaluation
    understands each organization’s
    purposes, roles, and responsibilities.
 Lesson:

Learning how to evaluate a project
requires change, and change takes
time and energy.
 Change, cont’d …

Evaluations can change not only
projects, but also the people
implementing the projects.
Hint for evaluation capacity
builders…
   Help educators understand that
    they are going through a change
    process. From time to time, help
    them reflect on where they are in
    the process.
   Show project leaders how they can
    use already-dedicated time when
    asking teachers to participate in
    evaluation efforts.
Hint for evaluation capacity
builders…


   Understand and prioritize the
    changes being asked of education
    project participants by recognizing
    that some changes are harder than
    others.
 Lesson:

Some specific knowledge and
skills will help make project
participants’ evaluation efforts
more valuable, effective, and
efficient.
Hints for evaluation capacity
builders…
   Actively teach educators how to
    collect, analyze, and interpret data.
   Help educators formalize informal
    data and evaluation practices.
   Show teachers how to use
    technology to access evaluation
    data previously not readily
    available.
Hints for evaluation capacity
builders…

   Educators who are inexperienced
    with project evaluation tend to
    collect the wrong data or too much
    data. Show them how to select and
    use data sources that will be the
    most meaningful for their projects.
Hints for evaluation capacity
builders…

   Find out what data educators are
    already collecting, and if appropriate
    and feasible, show them how they
    might use the data for their project
    evaluation.
Hints for evaluation capacity
builders…

   Don’t be surprised if some project
    stakeholders are reluctant to
    provide necessary data. This can
    happen especially when
    stakeholders do not see value in the
    evaluation.
Hints for evaluation capacity
builders…

   Help educators learn to provide
    feedback to stakeholders, showing
    the results and findings of the data
    collected.
    Hints for evaluation capacity
    builders…

   Don't be surprised if administrators and teachers "streamline" data collection procedures or instruments.


   Don’t be surprised if teachers use
    their new evaluation knowledge and
    skills in their own teaching.
 Lesson:

Success of a project evaluation –
and likely of the project itself –
depends on participants sharing
a sense of identity around the
effort.
 Identity cont’d…

Leadership of project evaluation
might come from unexpected
individuals, but regardless of
where it comes from, leadership is
most effective when shared.
Hint for evaluation capacity
builders…

   Help educators develop a plan for
    actively sharing their project and
    evaluation plans, activities, and
    results with stakeholders.
    Hint for evaluation capacity
    builders…

   Help project participants and those who are evaluating the project reach a consensus understanding of "how the project is supposed to work."
Hint for evaluation capacity
builders…
   If logic mapping is considered worthwhile, show educators how to use logic maps early in the project planning process. Allow enough flexibility for teams to illustrate their actual understanding of how their project works, i.e., don't be rigid about their using a particular logic map design.
 Lesson:

The leadership, shared
understandings, and sense of
community required for effective
project evaluation are heavily
dependent on good
communication.
Hint for evaluation capacity
builders…

   Help educators develop a plan for
    communication among everyone
    involved, such that communication
    is early, often, and in ways that
    support their efforts.
NCDPI Roadmap to Replicability

   IMPACT Products
   SBE Future-Ready Agenda
   Future-Ready Classrooms
In compliance with federal laws, NC Public Schools administers all state-operated educational programs, employment activities, and admissions without discrimination because of race, religion, national or ethnic origin, color, age, military service, disability, or gender, except where exemption is appropriate and allowed by law.

				