
        There is concern among educators about how best to use assessment results to
improve instructional practices for students. Some suggest that many of the tests we use
to measure achievement and intelligence may not inform instruction (Thurlow &
Ysseldyke, 1982; Elliott & Fuchs, 1997). Court cases, however, advise educators that

assessment must relate to instruction. In a review of Marshall et al. v. Georgia (1985),

Reschly, Kicklighter, and McKee (1988) wrote "...assessment procedures focusing on

correlated traits like general intellectual functioning are not as clearly related to

interventions and are therefore more difficult to justify, particularly if used as the sole or

the primary basis for significant classification/placement decisions" (p. 9). Clearly, there

is a need for alternative assessment models that will assist our instructional planning for

students (Stiggins, 1987).

        One solution that improves on the use of standardized tests is to implement the

Problem Solving Model (Deno, 2003). The primary components of this model are (a)

problem identification and analysis; (b) intervention design and implementation; and (c)

ongoing monitoring and evaluation of intervention effects. The process is data-based,

evaluates instruction at key decision-making points, and emphasizes the use of

performance-oriented and multidimensional assessment procedures. Performance-

oriented assessment procedures provide data specific to the identified areas of concern

and to the assessment questions generated through the Problem Solving process.

Multi-dimensional assessment provides information about environmental, curricular, and

instructional variables as well as student variables (Tilly, Grimes, & Reschly, 1993).

        Since 1993, MPS staff have implemented a Problem Solving Model in the District

(Marston, Muyskens & Lau, 2003). The decision-making flow utilizing problem
identification, intervention design, and systematic progress monitoring across three

stages of implementation is illustrated in the figure below.

      Student is Discrepant from School and/or Parent Expectation
                                 ↓
               Stage 1: Within Classroom Intervention
                                 ↓
             Student Does Not Respond to Interventions
                                 ↓
                Stage 2: Intervention Assistance Team
                                 ↓
             Student Does Not Respond to Interventions
                                 ↓
                   Stage 3: Student Support Team
                                 ↓
             Student Does Not Respond to Interventions
                                 ↓
                          Student Needs

       At Stage 1 the general education classroom teacher defines the problem, delivers

modified instruction, and systematically evaluates the impact of instruction.

If alternative interventions do not work within the classroom, a building intervention

assistance team addresses the needs of this student at Stage 2. This team opens up

access to more resources in the school by developing alternative interventions, which

may include: remediation from building specialists or educational assistants, Title I

support, help from Limited English proficiency staff, and/or consultation from special

education staff (Self, Benning, Marston, & Magnusson, 1991).
       If interventions are not effective at Stage 2, the student moves to Stage 3 where

the Student Support Team, which includes special education staff, school social worker,

and building psychologist, examines the student's difficulties. At this point due process

begins and more intensive interventions are attempted. At each stage in the Problem

Solving Model, school staff repeat the three-step process of identifying the problem,

developing an appropriate instructional strategy, and then systematically evaluating the

effectiveness of that intervention.

       If a student does not respond to interventions at each of the stages, he or she is

identified as a "Student Needing Alternative Programming (SNAP)," and more intensive

interventions are planned for the student. The main point to be made here is that the

student's instructional needs are not identified through standardized IQ and

achievement tests, but instead through the systematic evaluation of their academic

performance as a function of trying a continuum of progressively more intensive

instructional interventions. This is a model to benefit all students.

       Currently, the district is engaged in the implementation of a Voluntary

Compliance Agreement with the Office for Civil Rights. An essential feature of this

agreement is the screening of students with academic and behavior needs, providing

general education interventions, and monitoring student progress and response to

instruction. These key elements are consistent with the district's implementation of the
Problem Solving Model (PSM) and the Response to Intervention model.

       In addition, as part of the District’s commitment to literacy and student work, all

teaching staff will be encouraged to review student work to improve instruction and build

a shared understanding of standards. The major tool in this effort, teachers teaming
together to review student work, can be used effectively in the three-stage Problem
Solving Model.

       In all three stages of the Problem Solving Model process, teachers are asked to

continually monitor the student's response to new instructional approaches tried with the

pupil. When these students -- typically those who are not meeting district Curriculum

Grade Level Expectations -- do not respond to implemented interventions, more

intensive instruction is warranted.

       A key factor in implementing the Problem Solving Model is to continually assess

how well students respond to the different instructional strategies that are tried with the

student. A major problem is the lack of valid and reliable assessments that can be used

to measure short-term student progress and response to instruction throughout Stages

1, 2, and 3 of the Problem Solving Model. There are, of course, many standardized

tests available for assessing student academic performance. However, as Wiggins

(1992) points out, many educators "are concerned that standardized tests do not

accurately measure the capabilities of students who have difficulty coping in the

mainstream: children from disadvantaged families, children with learning disabilities,

children with limited English proficiency" (p. 40). Others criticize standardized tests

because they do not match instruction (Good & Salvia, 1988), are not useful for planning

instruction (Thurlow & Ysseldyke, 1982), and are not sensitive to student growth (Tindal

& Marston, 1990; Carver, 1974).

       Clearly, what is needed is an alternative to standardized tests. Stiggins

(1987) identified "performance assessment" as a viable option and described it as a

model based on teacher observation and judgment of student performance. Within this

model the "performance to be evaluated is defined in terms of content and/or skills, type

of behavior or product to be observed, and performance criteria" (p. 34). To ensure

performance assessments are of the highest quality, he recommends they must:
•   "be clear on the purpose of assessment...

•   "communicate effectively...

•   "maximize the validity of the assessment...

•   "maximize the reliability of assessment...

•   "attend to the economy of assessment..."(p. 39).
Performance Assessment of Reading

        Deno (1985) developed an alternative to standardized tests that is performance
oriented. His model, known as Curriculum-Based Measurement (CBM), is clearly a

performance assessment system that focuses on teacher observation of student

academic behaviors in a naturalistic setting. Considerable research shows that Deno

and his associates have met the Stiggins standards, including: purpose (Deno, 1985),

communicate effectively (Wesson & King, 1992), validity (Fuchs, Fuchs, & Maxwell,

1988; Marston, 1989), reliability (Marston, 1989), and economy of effort (Deno, 1985;

Shinn, 1989).

        By implementing these performance assessment procedures for reading, the

educator can conduct evaluations at two levels: individual and program. An example of

individual student evaluation of reading progress and instructional effectiveness is

shown in the figure below.

                        The graph shows the number of words a 2nd grade student reads correctly from

a 2nd grade reading passage during Reading Intervention A and Reading Intervention B.

As can be seen, student growth during the first instructional approach is not evident

whereas the student begins to improve during the second approach. For teachers using

the Problem Solving Model, this type of performance assessment in reading shows what

instruction is "working" or "not working" for students.
[Figure: Words Read Correctly (vertical axis, 0-50) plotted across school weeks 0-20,
with the Intervention A and Intervention B phases marked]
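The pattern the graph illustrates can be checked numerically: the least-squares slope of the weekly words-read-correct scores indicates whether an intervention is producing growth. The sketch below uses hypothetical scores chosen only to mirror the flat-then-rising pattern described above; it is not district data.

```python
# Sketch: compare growth (slope of words read correct per week) under two
# hypothetical reading interventions. Scores are illustrative, not district data.

def slope(scores):
    """Ordinary least-squares slope of scores over equally spaced sessions."""
    n = len(scores)
    mean_x = (n - 1) / 2
    mean_y = sum(scores) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in enumerate(scores))
    den = sum((x - mean_x) ** 2 for x in range(n))
    return num / den

intervention_a = [22, 21, 23, 22, 21, 23]   # flat: little or no growth
intervention_b = [23, 26, 28, 31, 33, 36]   # rising: clear growth

print(f"Intervention A slope: {slope(intervention_a):.2f} words/week")
print(f"Intervention B slope: {slope(intervention_b):.2f} words/week")
```

A near-zero slope under the first intervention and a clearly positive slope under the second would match the "not working" versus "working" judgment the text describes.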

Alignment with School Improvement Plans, District Content Standards,
State Tests, Student Work, and OCR Compliance Agreement

           School Improvement Plans. The CBM approach to performance assessment

has also been used effectively as a program evaluation tool (Marston & Magnusson,

1988). Currently, many schools in MPS are using this model for showing reading growth

on goals set in the School Improvement Plan (SIP). In the figure below is an illustration

of how one Minneapolis school analyzed the improvement of its students in reading from

fall to spring at each grade level.

           Following the collection of winter CBM data, many schools receive dropline
graphs that illustrate growth for individual students, classrooms, and grade levels. Using
these data, schools can determine which students are on track as well as which students need

instructional interventions. The graphs used by schools to examine student growth

midyear are below.

[Figure: “CBM: Fall/Winter 04/05” box plots of Words Read Correct for grades 1-5,
fall and winter; N = 39, 47, 34, 55, and 47 continuously enrolled students in
grades 1 through 5, respectively]

       The box plot graph above represents school-wide CBM data for the fall and

winter broken down by grade level. The scores on the box plot graphs are for

continuously enrolled students with the total number of students listed on the graph.

        The dark line in the middle of the colored bar is the median for the grade; the
bottom of the bar represents the 25th percentile, and the top of the bar shows the 75th
percentile for the grade level. The “whiskers” (lines extending from the bar) show the
total range of scores within that classroom or grade.
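The five values behind each box-plot bar (the median, the 25th and 75th percentiles, and the two whisker endpoints) can be computed directly from a grade's scores. The sketch below uses a hypothetical list of words-read-correct scores; the exact percentile values depend on the interpolation method, here Python's default exclusive method.

```python
# Sketch: the summary values behind one box-plot bar, computed from a
# hypothetical set of classroom words-read-correct (WRC) scores.
import statistics

scores = [18, 25, 31, 40, 44, 52, 57, 63, 70, 85, 98, 110]

# quantiles(n=4) returns the 25th, 50th, and 75th percentile cut points.
q1, median, q3 = statistics.quantiles(scores, n=4)

print(f"whisker low (minimum): {min(scores)}")
print(f"bottom of bar (25th percentile): {q1}")
print(f"dark line (median): {median}")
print(f"top of bar (75th percentile): {q3}")
print(f"whisker high (maximum): {max(scores)}")
```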

[Figure: “Grade Level CBM, Fall/Winter 2004/05” box plots of Words Read Correct by
classroom (Room XXX), annotated with the 75th percentile, 25th percentile, and
total-range “whiskers” for each class, plus fall (WPMF) and winter (WPMW)
words-per-minute markers]

The grade-level box plot graphs give much the same information as the school-wide
graph. The difference is that each graph represents a given grade level, with each of the
individual classrooms represented and listed.
[Figure: “CBM Room ###” dropline graph of Words Read Correct for individual students,
showing fall and winter words per minute for each student against district
percentile reference lines]

Schools also receive dropline graphs for each individual classroom. On the color graphs

provided to the schools, the green and red lines indicate the 65th and 25th percentiles for
the winter, respectively, according to district data. Students at or above the 65th
percentile are projected to pass the MBST reading test in 8th grade 85% of the time and
to score at Level 3 or greater on the MCA 85% of the time. Those below the 25th percentile are

considered struggling, and may need a classroom intervention worksheet.

       Content Standards. Archibald (1991) notes that standardized testing often falls

short in the area of measuring how well students meet standards and that alternative

assessments are necessary. But whatever the alternative assessment approach looks

like, Archibald warns it must address two challenges. First, assessment must "measure

what counts." Second, assessment should lead to "teaching toward authentic

standards" (p. 281). The model proposed here should be helpful on both points.

       Recently, the National Assessment of Educational Progress (NAEP) (Pinnell et al.,
1995) concluded that the measurement of reading fluency is an essential ingredient in
measuring student progress toward important standards in reading. The National
Reading Panel selected fluency instruction for study in part because of the need
identified by the NAEP results (Pinnell et al., 1995), which showed that 44% of the
fourth graders sampled nationwide were disfluent with grade-level stories. The National
Reading Panel also cited The National Research Council report, Preventing Reading

Difficulties in Young Children (Snow, Burns, & Griffin, 1998) and its recommendation:
“Because the ability to obtain meaning from print depends so strongly on the
development of word recognition accuracy and reading fluency, both of the latter should
be regularly assessed in the classroom, permitting timely and effective instructional
response when difficulty or delay is apparent” (p. 7).

          Further, the Minnesota Language Arts Standards recognize the importance of

fluency by including the sub-strand: word recognition, analysis, and fluency. Minneapolis
Public Schools reinforces the importance of measuring reading fluency by providing

guidelines for reporting student progress on the report card.

State Accountability Tests and NCLB

          The extensive use of the CBM measures with both special education and general

education students has provided us with a large database that is reflective of our district

population. This database enables us to refine our measures, develop local norms, and

establish benchmarks for student progress and achieving reading standards.

          In examining the predictive validity of CBM we have found that there is a strong

relationship between the CBM reading measures and the 2006 version of the Minnesota

Comprehensive Assessment in Reading (MCA II). The MCA II is the test used to

measure progress toward academic standards and to define adequate yearly progress

for NCLB. In our analysis of CBM scores we used a statistical equating method to

establish CBM benchmarks that predict the student’s level of performance on the MCA.

The four MCA levels are: Does Not Meet Standards, Partially Meets Standards, Meets
Standards, and Exceeds Standards. The two lines in Figure D represent the CBM

benchmarks that predict these levels. Students who have a CBM score that places them

below the lower line are predicted to be at the level “Does Not Meet Standards”.

Students with CBM scores between the two benchmark lines are predicted to be at the
“Partially Meets Standards” level. Students with CBM scores above the upper line are
predicted to be at the “Meets Standards” or “Exceeds Standards” level.

                                      Figure D
                       CBM Scores by Projected MCA Level
           (Words Read Correct; F = fall, W = winter, S = spring)

                                          1F   1W   1S   2F   2W   2S   3F   3W   3S   4F   4W   4S   5F   5W   5S
  Square - Meets or Exceeds Standards      6   26   58   36   62   84   57   83   96   78  100  116   96  116  126
  Circle - Partially Meets Standards       1   17   38   23   44   62   37   68   81   62   81   94   76   90  106

       For example, a teacher who administered the winter CBM measures of oral

reading fluency to a fourth-grade classroom determined the following median scores
for three students: 75, 90, and 115 words read correct. As can be seen in Figure

D, the student reading 75 words correct falls below the “Partially Meets Standards” cutoff

score of 81, or in the “Does Not Meet Standards” range. The student reading 90 is

projected to fall in the “Partially Meets Standards” range (81-99), while the student

reading 115 is projected to “Meet Standards” or “Exceed Standards” (100 or above).
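The classification walked through above amounts to comparing a score against the two benchmark lines. A minimal sketch, using the grade 4 winter cut scores from Figure D (81 and 100); the function name is illustrative:

```python
# Sketch: project an MCA level from a CBM words-read-correct (WRC) score
# using the two benchmark cut scores from Figure D.

def projected_mca_level(wrc, partially_meets_cut, meets_cut):
    """Classify a CBM score against the two benchmark lines."""
    if wrc < partially_meets_cut:
        return "Does Not Meet Standards"
    if wrc < meets_cut:
        return "Partially Meets Standards"
    return "Meets or Exceeds Standards"

# Grade 4 winter cut scores from Figure D: 81 and 100 words read correct.
for score in (75, 90, 115):
    print(score, "->", projected_mca_level(score, 81, 100))
```

These three scores reproduce the teacher's example: 75 falls below 81, 90 falls between the lines, and 115 is at or above 100.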

   The cut point for attaining the “Partially Meets Standards” level corresponds to the

30th percentile on the 2006 CBM norms. The cut point for attaining the “Meets
Standards” or “Exceeds Standards” level corresponds to the 45th percentile on the 2006

CBM norms.

       Data enabling us to establish cut scores for grades 6 through 8 are not yet

available. For these grades we recommend continued use of the data from the 65/85

analysis found in Table X.

                                               Table X

 Grade and           6         6        6       7       7      7       8       8      8
 Season             Fall     Winter   Spring   Fall   Winter Spring   Fall   Winter Spring
 Words Read
 Correct            108       123      137     116       125    134   129     137    146
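The grade 6 through 8 values recommended above lend themselves to a simple lookup. The sketch below copies the cut scores from Table X; the function name and season labels are illustrative.

```python
# Sketch: words-read-correct cut scores for grades 6-8, copied from Table X,
# keyed by (grade, season).
TABLE_X = {
    (6, "fall"): 108, (6, "winter"): 123, (6, "spring"): 137,
    (7, "fall"): 116, (7, "winter"): 125, (7, "spring"): 134,
    (8, "fall"): 129, (8, "winter"): 137, (8, "spring"): 146,
}

def at_or_above_cut(grade, season, wrc):
    """True if a student's WRC score meets the Table X cut for that season."""
    return wrc >= TABLE_X[(grade, season)]

print(at_or_above_cut(7, "winter", 130))   # 130 is at or above the 125 cut
print(at_or_above_cut(6, "fall", 100))     # 100 is below the 108 cut
```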

       Office for Civil Rights (OCR) Voluntary Compliance Agreement. OCR had
asked MPS to address issues related to the disproportionate numbers of students of

color who were referred to special education. OCR recommended that objective

screening procedures in reading be used to determine those regular education students

needing more intervention in the general classroom and to monitor student response to

these interventions. The reading assessment procedures presented here provide

schools with tools to meet these requirements.