
Constructing Validity Evidence
for Alternate Assessments:
A Multi-state Replication Study

CCSSO’s National Conference on Student Assessment
Orlando, Florida



      Ryan J. Kettler, Kris Kaase, Dawn McGrath, & Lisa Ford

                             June 18, 2008

                  The CAAVES project is funded by the
US Department of Education, Office of Elementary and Secondary Education
        Overview
I.     Consortium for Alternate
       Assessment Validity and
       Experimental Studies (CAAVES)

II.     State alternate assessment results
       A.  Mississippi
       B.  Indiana
       C.  Nevada

III.   Overarching Themes


                               CAAVES Project 2008   2
     Research Focus
From the CAAVES proposal (p. 5):
   Generate and evaluate evidence for validity by examining
    the relationship between alternate assessment scores and
    (a) concurrent measures of known constructs and (b)
    demographic characteristics of known groups of students
    with disabilities who are or are not eligible for participating
    in an alternate assessment.
   Provide a series of methodological replication studies
    across states with two common measures and similar
    alternate assessments.

     Addresses Peer Review
     Technical Questions
The Peer Review Guidance (2004) document asks:
   (b) Has the State ascertained that the assessments,
    including alternate assessments, are measuring the
    knowledge and skills described in its academic content
    standards and not knowledge, skills, or other
    characteristics that are not specified in the academic
    content standards or grade level expectations?
   (e) Has the State ascertained that test and item scores are
    related to outside variables as intended (e.g., scores are
    correlated strongly with relevant measures of academic
    achievement and are weakly correlated, if at all, with
    irrelevant characteristics, such as demographics)? (p. 35)
   Multi-Trait Multi-Method Design
Using classical test theory practices where understanding
   the construct(s) measured by a new test is advanced
   through comparisons with established tests that
   measure validated constructs, one can establish
   evidence about the criterion-related validity of resulting
   scores. This approach to validity evidence is commonly
   referred to as a multi-trait multi-method (MTMM)
   design….
As conceptualized by Campbell and Fiske (1959), the
   MTMM approach allows for an integrative multivariate
   framework within which information about convergent
   and discriminant validity is systematically gathered in a
   single study. The MTMM approach is also useful for
   providing evidence about the construct being measured.
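A minimal simulation can illustrate this convergent/discriminant logic. Everything below is an illustrative assumption (simulated scores, sample size, error levels), not study data:

```python
import numpy as np

# Two traits (reading, math) measured by two methods (alternate
# assessment, rating scale); all values here are simulated.
rng = np.random.default_rng(0)
n = 50
ability = rng.normal(size=(n, 2))                  # latent reading & math
aa = ability + 0.5 * rng.normal(size=(n, 2))       # method 1 scores
rating = ability + 0.5 * rng.normal(size=(n, 2))   # method 2 scores

# Columns: AA-reading, AA-math, rating-reading, rating-math
scores = np.column_stack([aa, rating])
mtmm = np.corrcoef(scores, rowvar=False)

# Convergent validity: same trait, different method (e.g., entry [0, 2])
# should exceed discriminant entries: different trait, different method
# (e.g., entry [0, 3]).
print(mtmm.round(2))
```

In an MTMM matrix, large same-trait/different-method correlations alongside small different-trait correlations support the claim that the new test measures the intended construct.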
Basic Matrix for Examining Relationships
Among Multiple Constructs




  CAAVES Interpretations of the
  Correlation Coefficient

Range          Category      Variance Explained   Description
-.30 to .30    Weak          9% or less           These two scores do not seem to be related.
.30 to .60     Moderate      9% to 36%            These scores represent related but distinct constructs.
.60 to .85     Strong        36% to 72%           These scores represent constructs that are highly related.
.85 to 1.00    Very Strong   72% or more          These scores may represent a single, unitary construct.
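These bands can be expressed as a small helper. The cut points come from the table above; the function name, return shape, and the handling of boundary values are illustrative assumptions:

```python
def interpret_r(r: float) -> tuple[str, float]:
    """Map a correlation to the CAAVES category and shared variance (r squared)."""
    a = abs(r)
    # Boundary values (.30, .60, .85) are assigned to the lower band here;
    # the table itself leaves boundary assignment ambiguous.
    if a <= 0.30:
        category = "Weak"
    elif a <= 0.60:
        category = "Moderate"
    elif a <= 0.85:
        category = "Strong"
    else:
        category = "Very Strong"
    return category, r * r

category, shared = interpret_r(0.55)
print(category, f"{shared:.0%}")  # Moderate 30%
```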

    Concurrent Measure #1: ACES
Academic Competence Evaluation Scales
(DiPerna & Elliott, 2000)
   Academic Skills (33 items: Proficiency Ratings only)
      Reading/Language Arts
      Mathematics
      Critical Thinking
   Academic Enabling Behaviors (40 items: Proficiency Ratings only)
      Interpersonal Skills
      Motivation
      Engagement
      Study Skills
   Scores summarized as competence levels based on national sample

    Concurrent Measure #2: VABS II

Vineland Adaptive Behavior Scales – II
(Sparrow, Cicchetti, & Balla, 2006)
   Communication
       Receptive, Expressive, Written
   Daily Living Skills
       Personal, Domestic, Community
   Socialization
       Interpersonal Relationships, Play & Leisure, Coping Skills
   Motor Skills
       Fine, Gross
   National norms; standard scores & adaptive levels

Sample

State           1% Eligible                Non-Eligible
                 Students                     Students
              Fourth   Eighth             Fourth   Eighth
Arizona         44       50                 22       13
Hawaii          26       11                  0        0
Idaho           20       20                 21       20
Indiana         40       43                115       86
Mississippi     16       11                 11       11
Nevada          43       22                 23       10
    Data Collection

   For each 1% eligible student
       State alternate assessment (Reading/LA & Math)
       ACES
       VABS II

   For each non-eligible student
       State alternate assessment (Reading/LA & Math)
       State general education achievement test
       ACES
       VABS II



Descriptive Statistics




Elliott, Compton, & Roach (2007):
Descriptive Statistics for Idaho

Eligibility     Sample   AA Reading       AA LA           AA Math

Eligible          91     93.20 (39.61)    43.47 (21.60)   105.91 (56.45)
Not Eligible      40     157.62 (31.04)   67.62 (16.60)   224.76 (48.46)
Elliott, Compton, & Roach (2007):
Relationships among Measures

 Content      Eligible?           ACES             ACES     VABS-II
  Area                            Skills          Enablers Composite
             Yes (n = 92)           .30*            .55*     .59*
AA Reading
             No (n = 44)            .59*            .68*
   AA        Yes (n = 91)            .19            .40*     .45*
Language     No (n = 44)            .48*            .60*
             Yes (n = 90)           .29*            .60*     .75*
 AA Math
             No (n = 44)            .71*            .74*



Elliott, Compton, & Roach (2007):
Correlations for Non-Eligible

                       Alternate Assessment
                   Reading    Language    Math
Standard Reading     .46*       .59*      .67*
Standard Language    .30        .49*      .48*
Standard Math        .21        .55*      .56*



CAAVES Findings
   The remainder of our presentation will focus
    on current CAAVES findings relevant to the
    alternate assessment for students with
    severe cognitive difficulties.
   State leaders will share findings from their
    own assessments, which generated unique
    patterns.
   Composite findings, overarching themes,
    and next steps will also be discussed.

Constructing Validity Evidence
for Alternate Assessments:
Mississippi


                  Kris Kaase
     Mississippi Department of Education

                 June 2008
Mississippi’s Alternate
Assessment

   Portfolio – with state defined content
   Content domains structured into:
       Clusters (e.g., Counting and Numbers)
           Items (e.g., Student rote counts from
            memory, Student identifies numerals from 0
            to 9)
   Evidence is collected for each cluster
    with representation for all items.
Mississippi’s Alternate
Assessment

   Evidence may be work samples, tests,
    observations, interviews, video/photo, or
    audio tape.
   Evidence must be aligned, representative,
    recent, and reliable.
   Each item is rated for proficiency (non-
    existent, emerging, progress, or
    accomplished).
   Reliability for every teacher is established by
    state-level independent scoring.
   Mississippi – Comparison of AA
   Eligible vs. Not Eligible Students
Grade    Eligible   ACES       ACES    VABS-II State AA State AA
            ?       Skills    Enablers            LA      Math

           Yes      42.92     122.07    72.19     113.88    100.94
         (n > 13)   (9.30)    (32.16)   (9.50)    (35.53)   (27.76)
Fourth
            No       72.00    127.44     90.00    154.45    145.00
         (n > 10)   (13.73)   (35.79)   (15.70)   (14.83)   (11.94)

           Yes       45.31    100.42     77.69    159.00     141.5
         (n > 12)   (19.83)   (41.31)   (13.24)   (46.88)   (41.78)
Eighth
            No       85.80    141.27     96.36    192.00    173.00
         (n > 10)   (32.43)   (32.72)   (18.74)   (9.77)     (5.5)


Note: Table based on 2006-07 data.
     Mississippi Relationships
     among Measures

Content     Eligible?      ACES      ACES       VABS-II   State
 Area                      Skills   Enablers   Composite Language

               Yes           .17      .07        .55*
             (n > 25)
Language
                No          .48*      .41*        .34
             (n > 20)

               Yes           .29      -.04       .57*      .87*
             (n > 25)
  Math
                No           .28      .35         .30      .94*
             (n > 20)

Note: Table based on 2006-07 data.
Constructing Validity Evidence for
Alternate Assessments: Indiana




                  Dawn McGrath
         Indiana Department of Education

                   June 2008
Indiana
   Alternate Assessment: ISTAR (Indiana
    Standards Tool for Alternate Reporting)
   Web-based instrument used since 2003
   Earlier validity evidence contributed to
    Indiana receiving “full approval with
    recommendations” in June 2006 through
    the peer review process.
    Indiana System
   System Website: https://ican.doe.state.in.us
Indiana System
Indiana System
[Diagram: ISTAR balances Policy, Practical Utility, and Psychometric Sufficiency]
Indiana System
Assessments integrated with goal setting and progress monitoring
Indiana Criteria
1. Evidence of a Significant Cognitive Disability

2. Precludes the student’s ability to acquire,
   maintain, generalize and apply academic skills
   even with extensive, intensive, pervasive,
   frequent and individualized instruction

3. Impedes participation in and completion of
   general education curriculum even with
   significant modifications
 Indiana Exclusions
a. Excessive or extensive absences
b. Social, cultural or economic differences
c. The mere existence of an IEP or
   identification in a specific disability category
d. A specific special education placement or services
e. Emotional, behavioral or physical challenges
f.   Anticipated scores on general education test
g. Concern for AYP calculations
Indiana Eligibility
   SEA reviews the profile of students who have been
    determined eligible to participate in ISTAR.
   SEA identifies student profiles that do not appear to
    meet criteria and notifies LEA of need to present
    additional evidence for appeal process.
   SEA may determine that a student is ineligible to
    participate in ISTAR based on the lack of evidence
    of a significant cognitive disability or evidence that
    the decision was based on an excluded factor.
   In this case, the score counts as a “no pass” for
    AYP calculations.
Indiana 1% Population
   About 0.7% of the population (4,000)
    credited with “proficient” through
    alternate achievement standards
   Around 1,000 students counted as
    “participating” but “not passing”
   SEA denies eligibility to 300-800 students
    any given year. These count as “not
    passing” and “not participating”.
Indiana 1% Population
   Sample group mirrored population
    distribution in state.
   ISTAR correlation to demographic
    variables:
                   AA                      AA
               Language Arts            Mathematics
           Grade 4       Grade 8   Grade 4      Grade 8


Gender      .02           -.12      -.02          -.07

Race        .07           -.11      .04           -.07
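Correlating a score with a dichotomous demographic code, as in the table above, yields a point-biserial coefficient. The numbers below are invented for illustration:

```python
import numpy as np

# Hypothetical AA scores and a 0/1 demographic code (e.g., gender).
scores = np.array([88, 95, 102, 110, 90, 105, 97, 108])
group = np.array([0, 1, 0, 1, 1, 0, 0, 1])

# Pearson r computed on a 0/1 variable is the point-biserial correlation;
# values near zero are discriminant-validity evidence.
r = np.corrcoef(scores, group)[0, 1]
print(round(r, 2))
```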
Descriptive Statistics
for Indiana
 Eligibility     ACES                      ACES          Vineland
                Academic                  Enablers


  4th grade     35.88 (5.76)            74.90 (40.57)   54.57 (17.06)
   Eligible         n=40                    n=40            n=37


  8th grade     37.56 (9.13)            89.95 (38.88)   66.59 (13.60)
   Eligible         n=43                    n=43            n=41


4th grade Not   71.34 (27.18)          129.49 (46.74)   93.64 (12.49)
   Eligible         n=115                  n=115            n=114


8th grade Not   61.84 (24.58)          124.08 (31.02)   88.27 (10.73)
   Eligible         n=86                    n=86            n=82
   Correlational Analysis of
   ACES to ISTAR
                          AA Language Arts     AA Mathematics
                          Grade 4   Grade 8   Grade 4   Grade 8

Academic Skills Total       .67       .70       .64       .71
Reading/Language Arts       .64       .69       .60       .64
Mathematics                 .58       .59       .58       .66
Critical Thinking           .59       .70       .57       .70
Academic Enablers Total     .65       .50       .53       .40
Interpersonal Skills        .44       .18       .34       .11
Engagement                  .64       .47       .51       .39
Motivation                  .50       .53       .39       .44
Study Skills                .65       .51       .56       .42
Correlational Analysis of
Vineland to ISTAR

                      AA Language Arts    AA Mathematics
                      Grade 4   Grade 8   Grade 4   Grade 8
Adaptive Behavior
                        .70       .64       .67       .58
Composite
Communication           .73       .64       .70       .59
Daily Living Skills     .71       .65       .71       .61
Socialization           .67       .56       .59       .45
Motor Skills            .62       .39       .65       .43
Indiana Findings
   ISTAR correlated weakly with gender and race.
   ISTAR generally correlated with ACES Academic
    Skills and with Vineland Communication and Daily
    Living Skills in the moderately strong to strong range.
   ISTAR generally correlated with ACES Academic
    Enablers and Vineland socialization and motor skills
    in the moderate range.
   Resulting scores clearly differentiate students who
    are eligible from those who are not.
   The study supports the measurement content and
    sensitivity of the ISTAR and suggests further work.
Constructing Validity Evidence
for Alternate Assessments:
Nevada


                Lisa Ford
      Nevada Department of Education

                June 2008
Nevada Alternate Scales of
Academic Achievement

     (NASAA)
NASAA Overview

   The Nevada Alternate Scales of Academic
    Achievement (NASAA) is the statewide
    alternate assessment for students assessed
    against alternate achievement standards.

   The NASAA assesses students’ academic
    performance through direct observation of
    specific tasks administered by the
    classroom teacher.
What academic skills are
assessed using the NASAA?

 The NASAA assesses seven mandatory
 academic strands. Three strands are
 assessed in English Language Arts (ELA) in
 grades 3-8 and 11; three strands are
 assessed in Math in grades 3-8 and 11; and
 one strand is assessed in Science in grades
 5, 8, and 11. (See Table 1 for a list of
 academic strands by domain.)
   The strand level refers to the strands within a
    standard.

   The benchmark skill level (see Table 2) refers to the
    level of complexity within each strand.

   Academic activities are assessed by assigning a
    score to the students’ performance based on
    accuracy (0 through 3) and the level of assistance
    (0 through 3).
   Two benchmark-level skills are assessed for each
    of the three strands in ELA and Mathematics, and
    for the one strand assessed in Science in grades 5,
    8, and 11.

   The benchmark scores are totaled to generate a
    raw score for each specific domain. These raw
    scores are used to determine the students’ level
    of proficiency.
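The roll-up described above can be sketched as follows. The combining rule (a simple sum of accuracy and assistance), all names, and the sample ratings are assumptions for illustration; the deck specifies only the 0-3 ranges and the two-benchmarks-per-strand structure:

```python
# Each benchmark skill gets an accuracy rating (0-3) and an
# assistance rating (0-3); two benchmark skills per strand.
def benchmark_score(accuracy: int, assistance: int) -> int:
    assert 0 <= accuracy <= 3 and 0 <= assistance <= 3
    return accuracy + assistance  # combining rule assumed to be a sum

def domain_raw_score(strands) -> int:
    # strands: per strand, a list of (accuracy, assistance) pairs
    return sum(benchmark_score(a, s) for strand in strands for a, s in strand)

# Three ELA strands, two benchmark skills each (invented ratings)
ela = [[(3, 2), (2, 2)], [(3, 3), (2, 1)], [(1, 2), (2, 2)]]
print(domain_raw_score(ela))  # 25
```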
How is the NASAA
administered?



   The NASAA is both a formative assessment of
    performance (measuring progress) and a
    summative assessment of proficiency.
   The testing window for the NASAA runs from
    September through January of each school year.
   Benchmarks are chosen based on goals and
    objectives set forth in the IEP.
   Once the benchmark skills and academic activities
    are determined, the teacher will engage the student
    in a videotaped performance of the academic skill
    being assessed.

   The teacher will use the scoring rubric to determine
    the scores for the student’s level of accuracy and level
    of assistance and enter these scores into the
    testing profile software.
Videos and Event Recording Performance
Sheets will then be sent to the state to
undergo a validation of scores. These
validated scores will then be utilized by the
state to assign student levels of proficiency
in each of the assessed academic content
domains.
How is proficiency
determined?
   Proficiency ratings for ELA, Math and Science were
    determined by a group of state stakeholders who participated
    on a Standards Setting Committee to determine content level
    proficiency (e.g., Emerging, Developing, Approaching
    Standard, Meets Standard, and Exceeds Standard).

   Proficiency ratings are then reported to the schools, and are
    utilized when reporting district and school level Adequate
    Yearly Progress (AYP) determinations.

   Individual student proficiency is discussed with parents and
    used by the IEP Committees to monitor academic progress
    and determine present and future programming needs.
Table 1: Content Strands

Strand         Description of Strand

ELA
E: 2.0.1       Students use reading process skills and strategies to build comprehension; pre-reading
               strategies.
E: 8.0.4       Students listen to and evaluate oral communication for content, style, speaker’s purpose,
               and audience appropriateness; following directions.
E: 10.0.1      Students participate in discussion to offer information, clarify ideas, and support a position;
               conversations and group discussions.

Mathematics
M: 2.0.1       To solve problems, communicate, reason, and make connections within and beyond the
               field of mathematics, students will use algebraic methods to analyze, illustrate, extend, and
               create numerous representations (words, numbers, tables, and graphs) of patterns,
               functions, and algebraic relations as modeled in practical situations; patterns.
M: 3.0.2       To solve problems, communicate, reason, and make connections within and beyond the field
               of mathematics, students will use appropriate tools and techniques of measurement to
               determine, estimate, record, and verify direct and indirect measurements; measurement.
M: 4.0.1       To solve problems, communicate, reason, and make connections within and beyond the
               field of mathematics, students will identify, represent, verify, and apply spatial relationships
               and geometric properties; two-dimensional shapes.

Science
S: 1.0         To systematically examine the natural world, use data, record-keeping, and safe
N.5/N.8/N.12   experimentation to conduct scientific inquiry.
Table 2: Benchmark Skill Levels

CBK Level:                Academic assessment of skills closely linked to the content
Complex Benchmark Level   standard.

CEB Level:                Academic assessment extended to students with more
Complex Extended          significant disabilities.
Benchmark

LBK Level:                Academic assessment extended to students with the most
Less Complex Extended     severe disabilities.
Benchmark
Results: NASAA Validation
Evidence

   The correlations among the constructs measured
    by the NASAA are larger than the correlations
    between the various ACES Academic Skills
    subscales and the NASAA content areas of ELA
    and mathematics.

   The NASAA-to-VABS-II Adaptive Behavior
    Composite and NASAA-to-ACES Academic
    Enablers total correlations were consistently larger
    (see Tables 3 and 4) than the NASAA-to-ACES
    Academic Skills subscale correlations.
   Table 5 provides the means and the
    standard deviations of ACES ratings by
    teachers of students with disabilities who did
    not qualify to take the NASAA, but instead
    took the statewide achievement test.
   Table 6 shows the means for ACES
    Academic Skills total and Academic Enablers
    total are slightly higher than those of the
    eligible group.
   Table 7 demonstrates that the NASAA ELA
    score correlated positively with the large-
    scale language arts and mathematics
    scores at 8th grade, but negatively at
    4th grade.

   This suggests that the NASAA and large-
    scale assessments may tap unrelated
    aspects of performance.
Table 3: Results of Correlational Analyses of AA
Subscales Scores with ACES Scores
Table 4: Results of Correlational Analyses of AA
Subscale Scores with Vineland Scores




                            AA Language Arts      AA Mathematics

                            Grade 4    Grade 8   Grade 4    Grade 8
      Adaptive Behavior       .13        .77       .46        .80
      Composite             (n = 44)   (n = 8)   (n = 44)   (n = 9)
                              .18        .81       .48        .88
      Communication
                            (n = 45)   (n = 8)   (n = 49)   (n = 9)
                              .06        .85       .44        .85
      Daily Living Skills
                            (n = 49)   (n = 8)   (n = 49)   (n = 9)
                              .21        .36       .39        .37
      Socialization
                            (n = 49)   (n = 8)   (n = 48)   (n = 9)
                              .10        .73       .54        .63
      Motor Skills
                            (n = 48)   (n = 7)   (n = 48)   (n = 8)
Table 5: Means and Standard Deviations for the
Academic Competence Evaluation Scales (ACES) by
Eligibility for the Alternate Assessment


                                      Grade 4                           Grade 8
                             Eligible         Ineligible       Eligible         Ineligible
                            Mean (SD)        Mean (SD)        Mean (SD)        Mean (SD)
                           44.50 (15.34)    47.33 (13.97)    39.50 (10.19)    33.00 (0.00)
Academic Skills Total
                             (n = 32)          (n = 15)         (n = 6)          (n = 2)
                           14.76 (4.90)     16.38 (5.58)     14.67 (5.68)     11.00 (0.00)
Reading/Language Arts
                             (n = 34)          (n = 16)         (n = 6)          (n = 2)
                           10.67 (4.04)     11.69 (3.89)      9.50 (2.35)      8.00 (0.00)
Mathematics
                             (n = 33)          (n = 16)         (n = 6)          (n = 2)
                           19.15 (6.88)     19.73 (5.74)     15.33 (3.27)     20.33 (10.97)
Critical Thinking
                             (n = 33)          (n = 15)         (n = 6)          (n = 3)
                          101.77 (31.17)   125.81 (33.04)   135.40 (56.15)   104.33 (32.72)
Academic Enablers Total
                             (n = 26)          (n = 16)         (n = 5)          (n = 3)
                           34.00 (7.95)     39.00 (8.38)     39.33 (16.07)    37.67 (11.06)
Interpersonal Skills
                             (n = 35)          (n = 16)         (n = 6)          (n = 3)
                           19.82 (9.49)     24.25 (9.10)     22.33 (12.18)    16.33 (6.43)
Engagement Skills
                             (n = 34)          (n = 16)         (n = 4)          (n = 3)
                           24.33 (8.87)     30.69 (9.13)     30.33 (15.20)    22.67 (8.33)
Motivation
                             (n = 33)          (n = 16)         (n = 6)          (n = 3)
                           25.11 (8.94)     31.88 (9.51)     33.67 (16.10)    27.67 (11.06)
Study Skills
                             (n = 28)          (n = 16)         (n = 6)          (n = 3)
   Table 6: Means and Standard Deviations for State
   Assessments by Eligibility Status




                                   Grade 4                         Grade 8
                           Eligible       Ineligible       Eligible       Ineligible
                         Mean (SD)       Mean (SD)       Mean (SD)       Mean (SD)
                         32.42 (4.62)    35.35 (1.23)    25.23 (8.65)    30.70 (3.95)
Language Arts
                           (n = 43)        (n = 23)        (n = 22)        (n = 10)
                         30.58 (5.05)    34.52 (2.09)    24.00 (7.69)    29.20 (4.29)
Mathematics
                           (n = 43)        (n = 23)        (n = 22)        (n = 10)
Large-scale Assessment                  183.92 (58.35)                  244.80 (34.51)
Reading                                    (n = 25)                        (n = 10)
Large-scale Assessment                  183.04 (77.84)                  198.18 (62.62)
Mathematics                                (n = 25)                        (n = 11)
Table 7: Results of Correlational Analyses of AA
Subscale Scores with Nevada’s Large Scale Assessment
for Students with Disabilities Who Did Not Qualify to
Participate in the AA




                      AA Language Arts AA Mathematics
                      Grade 4 Grade 8 Grade 4 Grade 8
        General         -.36     .55     -.07     .20
        Language Arts (n = 22) (n = 9) (n = 22) (n = 9)
        General         -.32     .88     .00      .48
        Mathematics (n = 21) (n = 10) (n = 21) (n = 10)
Building Validity Evidence: A Discussion
of the Findings


   This investigation focused on providing more
    information about the constructs measured by the
    NASAA and the ability of its resulting scores to
    differentiate between groups of students known to
    be eligible or not to participate in the statewide
    alternate assessment.
   The relationship between the ELA and Mathematics
    achievement level ratings on the NASAA and the
    concurrent scores on the ACES-Academic Skills
    scales for the eligible students varied across grade
    clusters, but in general were modest.
   Correlations for the same score relationships
    increased noticeably in the non-eligible group.

   Although the scores between academic skills on the
    NASAA and other measures indicate a meaningful
    amount of shared variance (e.g., 20% to 40%),
    there are cases, particularly at the elementary
    grades, where there was more shared variance with
    the academic enabling and adaptive behavior
    constructs.
Conclusions

   This investigation has advanced understanding of
    what the NASAA measures and its ability to
    differentiate levels of functioning for students with
    significantly different degrees of impairment.

   Findings indicate that the NASAA is not only a
    measure of academic skills, but also of behaviors
    (e.g., social skills, engagement/motivation, and
    study skills) that enable academic function and daily
    living.
   There is a meaningful amount of construct
    irrelevant variance in the NASAA scores, yet the
    scores are functioning rather well in differentiating
    performances by known groups of students.

   This variance will be reduced over time as (a)
    students’ IEPs become directed more at core
    academic skills and (b) teachers continue to receive
    professional development that improves their
    collection of standards-based evidence for NASAA
    ratings.
Composite Findings,
Overarching Themes, and Next
Steps


           Ryan J. Kettler
         Vanderbilt University

              June 2008
Overarching Themes

   Known groups provided validity evidence, as non-
    eligible students outscored eligible students.

   Correlations with validated measures
     Moderate with ACES for non-eligible students

     Strong with VABS-2 for eligible students


   Correlations with general assessments were
    moderate for non-eligible students
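The known-groups logic above can be sketched as a standardized mean difference. The scores below are invented, not study data:

```python
from statistics import mean, stdev

# Hypothetical AA scores: non-eligible students should outscore eligible.
eligible = [90, 105, 80, 113, 95, 102, 88, 110]
non_eligible = [150, 162, 141, 158, 170, 149, 155, 166]

# Cohen's d with a simple pooled SD (equal group sizes assumed)
pooled_sd = ((stdev(eligible) ** 2 + stdev(non_eligible) ** 2) / 2) ** 0.5
cohens_d = (mean(non_eligible) - mean(eligible)) / pooled_sd
print(round(cohens_d, 2))
```

A large positive d is the known-groups evidence the deck describes: the assessment separates groups it should separate.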


What is being measured?


[Diagram: the Alternate Assessment shown in relation to VABS, ACES Enablers, and ACES Skills]

Questions

   What is an acceptable amount of shared
    variance between proficiency tests and
    adaptive behavior for eligible students?

   What is the relationship between academic
    skills, enablers, and adaptive behavior?

   What questions do you have for us?

   Thank you! (ryan.j.kettler@vanderbilt.edu)
