

Running head: NCLB’S SUPPLEMENTAL EDUCATIONAL SERVICES (SES) DEBATE




   No Child Left Behind (NCLB) Supplemental Educational Services (SES) Debate of SES

Effectiveness: Do District Run Supplemental Educational Services Make an Impact on Student

                                      Achievement?

                                      Tracy Alberry

                        California State University, San Bernardino


    No Child Left Behind (NCLB) Supplemental Educational Services (SES) Debate of SES

 Effectiveness: Do District Run Supplemental Educational Services Make an Impact on Student

                                          Achievement?

      Public school choice and supplemental educational services (SES) are two options under

the “No Child Left Behind Act of 2001” (NCLB) for parents to improve the educational

opportunities for their children. Since the passage of the NCLB Act of 2001, schools in Year 2 of

Program Improvement or above must offer SES to low-income students (those students on free

or reduced lunch). Year 2 Program Improvement schools are schools which receive Title I funds

and have not made Adequate Yearly Progress (AYP) for three consecutive years. SES consists of tutoring services offered by for-profit companies, nonprofit organizations, or school districts acting as SES providers.

      Each school district must set aside a minimum of 5% of its overall Title I funds to pay for SES. In addition, the district may be required to spend up to 20% of its Title I budget on school choice and SES combined in order to serve all qualified students who apply. There is a state-set per-pupil rate that each district must spend on each student. Costs charged by tutoring companies (referred to as providers) ranged from as low as $20 to as high as $80 per hour for one-on-one tutoring. Districts may count up to 5% of the 20% obligation toward transportation for public school choice. Under public school choice, the district pays to transport students attending a Program Improvement school to a school that is not in program improvement. By law, districts must offer school choice at all Title I schools in program improvement.

      Interestingly, a school may opt out of the SES requirement of NCLB by declining Title I funds, thus excluding itself from NCLB mandates such as SES and choice.


SES is designed to increase student achievement by giving students tutoring in math, reading, or language arts. School districts do not have to fund the total number of students eligible for SES; districts may choose to fund students only until they have spent 20% of their total Title I budget. Should more students apply than a district can serve, priority goes to the lowest-achieving low-income students.

     In 2005, Riverside Unified School District (RUSD) applied directly to the state to become an SES provider, calling its program RUSD Academy. Although RUSD Academy was part of the school district, it was treated as a separate provider and was listed on the California state list of approved SES providers. RUSD Academy provided group tutoring services to 283 elementary school students at six different elementary schools. Outside SES providers served approximately 500 additional students. The total number of students who received SES services represented only about one third of the number of students eligible. The purpose of this report was to present findings regarding the effectiveness of RUSD Academy by studying the students at one of the six RUSD Academy sites, referred to here as "RUSD SES Site 1." English Language Arts (ELA) scores were used because students were tutored only in English Language Arts.

    Whether SES increases student achievement is still open to debate. Current research on SES is limited and lacks conclusive evidence of the relation between SES and student achievement. Some studies analyze evidence on the effectiveness of SES in relation to student achievement, yet many fail to compare the effects of district-run SES programs with those of nonprofit or for-profit tutoring companies offering SES. NCLB requires states to monitor all providers and track their effectiveness, yet monitoring by states is often sporadic and not adequately funded. "Five years after the No Child Left Behind Act became law, there's still a dearth of research evidence to show whether one of the federal measure's least-tested innovations—a provision that calls for underperforming schools to provide after-school tutoring—has an impact on student achievement" (Viadero, 2007, p. 7).

      According to a California Department of Education official who asked to remain anonymous, the state of California is investigating improving the monitoring process of SES through the hiring of an outside entity yet to be determined. Based on various measures, studies have demonstrated little if any gain on state achievement tests by students attending SES. There is a need for additional quantitative studies examining student progress and the effectiveness of SES over several years, including studies on what measurement should be used to gauge SES effectiveness. Debate about the effectiveness of SES programs persists because of the limited number of students served (in RUSD, only one third of the qualified students applied for and received services), the lack of state monitoring of SES tutoring providers, the lack of pre- and post-assessments to adequately monitor provider effectiveness, and the lack of alternative measures, beyond state standardized tests, of overall student performance.

       The researcher proposed that the effectiveness of outside providers needs to be compared to the effectiveness of district-offered SES programs. The first step in this process was to examine the effectiveness of a district offering SES by measuring students' gains on ELA CST scores after receiving tutoring.

      RUSD Academy students' scores on the California Standardized Test (CST) were examined to see whether there was an association between hours tutored and change in CST scores. District-run SES programs may be offered only if the school district is not itself in program improvement, unless special dispensation is granted, as was the case with the Chicago Public Schools (CPS). CPS was granted special dispensation by Margaret Spellings, Secretary of Education, and was able to offer SES regardless of its district PI status because it had demonstrated growth in its district-run SES program. Showing the effectiveness of SES programs may lead more school districts to offer their own SES programs even if they are in program improvement.

      Many questions have been raised about SES. Are SES services an effective use of federal money for increasing student achievement? Who is the better provider of tutoring: outside free-enterprise providers, who compete with each other for students and hire tutors without hiring guidelines, or school districts themselves, which hire teachers considered "highly qualified"? How can the effectiveness of SES providers be reliably and consistently monitored at the federal, state, and district levels? The first step in answering these questions was to study students in one school district by examining their gains on the ELA CST.

      Current research and current data on SES from RUSD Academy were examined in order to document SES effectiveness in increasing student achievement. Specifically, one school's elementary students' CST scores were examined. The hypothesis was that RUSD Academy located at School A, as a district-run SES provider, would be associated with an increase in CST scores for students who attended 20 or more hours of SES tutoring. Specifically, scores of students in grades 3-6 during the 2006-2007 school year were examined.

                                          Review of Literature

History of SES

      The 2002-2003 school year marked the first year that schools in Program Improvement (PI) Year 2 status were required to offer SES. The California County Superintendents Educational Services Association (CCSESA) published a special issue of the California Curriculum News Report on No Child Left Behind supplemental services and choice in February of 2004 containing articles on SES. This report discussed procedures for districts to follow in setting up SES services and was a useful source of information on SES. In the report, Carol Brush (2004), a notable specialist on SES, stated that her "district believes that the supplemental services program can assist students to attain state standards" (p. 3). The article failed to describe how this would be achieved and did not mention how effectiveness would be measured. To date, no specific measurements of effectiveness have been published; if there were studies or findings, the school districts and states have not made any findings on providers' effectiveness public in a manner useful to parents for selecting providers.

       Rita Brogan (2004) discussed two basic reasons for the Imperial County Office of Education in California wanting to become an SES provider: to increase student achievement as measured on achievement tests and to increase students' desire to learn. Though attendance and commitment are established by design, the program lacked effective monitoring procedures. The SES designs described in the article reflect the same weakness found in the NCLB law itself, which lacks any mandated and consistent monitoring requirements, suggesting that programs will only mirror the weaknesses and limitations of the law. Her findings are supported by other research cited in this paper.

Participation Rates versus Funding Availability

      Currently, nationwide, NCLB requires school districts to set aside 20% of their Title I budgets to pay for school choice and Supplemental Educational Services (SES), with a minimum of 5% going to each program. This funding does not come close to serving all students eligible for SES. In Riverside, $1.9 million was set aside for services for the 2007-2008 school year, including transportation costs for choice, and a state-mandated per-pupil amount of $1,244.00 must be spent on SES tutoring for each participating student. The amount set aside could therefore serve only about 1,527 students, roughly half of the more than 3,000 students eligible, and that is only if all of the funds went to SES and none to choice transportation costs. For the current school year, only about 700 students are applying for SES services from outside providers, down from roughly 900 in the prior 2006-2007 school year, when the district was able to offer SES services through the district, hosted at Riverside schools. After the 2006-2007 school year, RUSD Academy no longer existed and the district offered no SES services other than the tutoring provided by outside providers, which could account for the drop in the number of students applying for SES.
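A brief worked check of the funding arithmetic above (an illustrative calculation assuming the entire $1.9 million set-aside went to SES tutoring at the state per-pupil rate, not a figure reported by the district):

$$
\frac{\$1{,}900{,}000}{\$1{,}244 \text{ per student}} \approx 1{,}527 \text{ students},
\qquad \frac{1{,}527}{3{,}000 \text{ eligible}} \approx 51\%.
$$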

Chicago Public Schools (CPS) SES Shows Highest Gains

      The CPS High School Supplemental Educational Services (SES) Tutoring Program study is an evaluation of the third year of SES programs in the Chicago Public School district. Put together by the Chicago Public Schools (2007), the study examined 55,600 students across 324 schools during the 2005-2006 academic year. Standardized test data were used to gauge student achievement. A small increase was seen in reading but only a negligible gain in math: "Participation in the SES program resulted in a small but significant improvement in reading achievement performance compared to other low-income, low-achieving students, attending the same schools" (Office of Research, Evaluation, and Accountability; Office of Extended Learning Opportunities, 2007, p. 2). The study also examined the evidence of success for each individual provider. The Chicago Public Schools' own district-sponsored tutoring program showed the most growth in student achievement. This study is unique in its criticism of outside providers. A subsequent report published in August 2007 by CPS examined all providers in relation to 9th grade students; interestingly, that report claimed there were no significant gains in student achievement for students attending SES tutoring. The CPS reports model effective research strategies and could serve as a model for other districts measuring SES effectiveness.

PI Districts Lose the Option to Offer District-Provided SES

      Federal guidelines state that once a school district enters PI status as a district, it may no longer be an SES provider of tutoring services. This mandate has prevented Fontana, Riverside, the Chicago Public Schools, and other school districts from offering their own SES tutoring. Chicago Public Schools was able, through a special dispensation from the federal Secretary of Education, to offer district SES services by presenting data showing its programs' effectiveness and by agreeing to certain terms of implementation. CPS agreed to have an outside provider evaluate its program (a requirement not placed on outside providers), to open its school campuses to outside providers, and to collect and share assessment data. Other school districts, such as Riverside and Fontana, have found it necessary not to offer district SES programs.

Initial Program Effectiveness Debate

     Current studies are inconsistent in their results, with some providers showing growth while others do not. There has been no consistent measure of the effectiveness of SES in relation to student achievement. SES providers also vary in the pre- and post-tests they use to track student achievement, and the methods and curriculum used by each tutoring service vary as well. One company may offer 1:1 tutoring in one district and 1:3 tutoring in another. Each provider uses its own tutoring program, ranging from "Hooked on Phonics," a marketed program, to computer-based applications. In measuring the effectiveness of a provider on student achievement, one is essentially also measuring the effectiveness of the materials the provider uses. Much of the current research calls for additional research, as there is no consistent measurement within states to accurately track effectiveness. It is extremely disturbing that the state of California does not publish findings, even though it requires each SES provider to collect CST scores for all students tutored, as well as the number of hours each student was tutored. In 2005 Riverside Unified collected data, including CST scores, from approximately 200 students in its district-run tutoring program, yet no published findings by the state of California could be located.

       Potter, Ross, Paek, and McKay (2007), in the publication "Supplemental Educational Services: 2005-2006 (2004-2005 Student Achievement Results)," studied the implementation of SES in six school districts across the state of Tennessee that were required to offer SES in the 2005-2006 school year. The major goal of the study was to evaluate the perceived progress and outcomes of the SES program. In addition, the report focused on provider services and outcomes and on the Local Educational Agencies' (LEA) implementation of SES services. Rubrics were designed to help evaluate SES providers, 33 of which were analyzed in the study. Surveys were used as the main instrument. Though the report examined Tennessee schools, it was not a statewide study.

       According to Burch (2007), only the Minneapolis and Chicago school districts have completed studies attempting to assess SES's impact on learning outcomes: "Research to date offers only limited understanding of what kinds of assessment might be useful in determining the costs and benefits of various SES models" (p. 15). All states need effective means of measuring SES effectiveness. The evaluation of SES providers is a "relatively new and emerging endeavor" (Ross, Potter & Harmon, 2006, p. 22). They stated, "Assuming that not all eligible students participate in SES, maintaining information about 'eligible but not participating' students could lay the groundwork for a possible quasi-experimental (e.g., matched samples), student-level design" (p. 14). Though there is no standard nationwide measure for assessing SES effectiveness, assessments such as the California Standardized Tests can be used as one marker of possible SES effectiveness at the state level. The state or a school district could use matched samples of CST scores to measure SES effectiveness.

      Fusarelli (2007) reviewed the provisions of SES, school districts' implementation of SES, resistance to SES, and obstacles to SES, and presented solutions to weaknesses in the program. He called for more research overall and made recommendations for future studies; his conclusions were based on 13 other studies and research papers. NCLB gives states the responsibility of measuring and reporting student improvement and of deciding who should be eligible to be an SES provider. Yet current research illustrates how states are floundering in their search to adequately monitor and measure SES effectiveness. If the federal government and the states fail to establish standards, districts such as RUSD should measure and present their own findings on provider effectiveness, though the best-case scenario would be a national standard of measurement, or at least state standards. According to the Electronic Education Report, "Thirty-eight states report they are unable to monitor the quality and effectiveness of supplemental education service providers, most often because they have insufficient staff and inadequate funding to do so, according to a study released this month by the Center on Education Policy" (Bowler, 2007, p. 7).

     NCLB is currently up for reauthorization. If changes are not made and adopted by legislators, NCLB will remain as it is, with the same inadequacies. It could be years before any substantial changes are made to the legislation and before people recognize that a goal of 100% proficiency in math and English language arts might not be realistic for all students and school districts. For now, SES is in place to help move students toward that 100% proficiency goal.

Additional Evidence of Achievement, Effective to Inconclusive

      Miners (2007), in "SES Effectiveness Is a Matter of Debate," reviewed two recent reports on SES that reached differing conclusions using data from different sources. One reported that SES is producing gains in student achievement, while the other reported that the evidence is inconclusive due to states' lack of monitoring capabilities. The comparison shows how data from different sources can be used to reach two completely different conclusions. The article highlights states' varying abilities to monitor SES programs but fails to go into depth about the data used in the studies.

      Pascopella (2004), in "Signs of Improvement with SES," reported SES successes in 2004. The article reviewed a prior study of 33 school districts across 47 states. It includes the finding that some aspects of NCLB are unworkable and reports that states lack the resources to carry out the NCLB requirements. The article further supports the argument that more studies on SES effectiveness are needed.

      Sunderman (2006), in "Do Supplemental Educational Services Increase Opportunities for Minority Students?", goes so far as to argue that supplemental services are actually detrimental to schools. The article makes the point that there is scant research on the effectiveness of SES programs and claims that money used for SES would be better spent elsewhere. He suggested that classroom approaches would be more effective than individual tutoring, given the lack of support for any relationship between SES and increased student achievement. He reported, though, that all students attending a school in Year 2 of program improvement are eligible for SES services; in fact, only low-income students, usually those on free or reduced lunch, are eligible. If districts pay for students who are not low income, Title I funds may not be used to pay for those services, and the money spent may not be counted toward the district's required SES spending. Overall the article offers subjective opinions and not enough objective facts; though it includes a long list of references, its findings may be open to debate.

Proposed Solutions to NCLB Weaknesses

        There seems to be no consistency among states in implementing and monitoring SES programs. The policy brief by Patricia Burch (2007), "Supplemental Education Services under NCLB," examined several studies and national information to make recommendations for SES, which as of 2007 is entering its sixth year of implementation. The brief highlights the fact that states and districts often face limitations, such as lack of funding, in implementing and monitoring the effectiveness and policies of SES programs. Burch recommended that the NCLB law be redesigned and suggested federally funded evaluations of SES. Also suggested is an examination of the feasibility and desirability of reallocating Title I funds to state reform efforts. In addition, she called for a closer look at the inconsistency between the high-stakes accountability NCLB imposes on schools and the lack of accountability for the high-stakes tutoring that is supposed to improve student achievement and improve the status of schools. The article is clear and concise, and Burch is a respected SES specialist who trained new SES coordinators in California during the 2006-2007 school year.

        Riverside does, though, offer its former SES services to low-income students attending schools in danger of entering program improvement. A question that may be asked is whether it is the program a district offers that makes the tutoring successful, or the fact that students are being tutored in their own schools by their own teachers. This question may not be easily answered. Current research is inconclusive as to whether SES has an effect on student achievement as measured by gains on standardized tests. Companies vary in the pre- and post-tests they use and in their tutoring materials, and there are no clear federal guidelines for states to monitor SES effectiveness. To measure SES effectiveness, a district may be left with no choice but to do so itself, using student achievement scores on the CST and the number of hours a student attends tutoring. A district study may show that an increase in student achievement correlates with the number of hours students attend a provider's tutoring sessions. For the study completed in this paper, it was expected that the mean change in CST scores of students who received tutoring would be higher than that of students who did not receive tutoring.

                                            Hypothesis #1

       It was predicted that students in grades 3 to 6 at a given elementary school during the 2006-2007 school year who attended the district-run RUSD Academy for 20 or more hours of SES tutoring would demonstrate a positive effect of SES by gaining more from the 2006 ELA CST to the 2007 ELA CST than students from the same school who did not attend SES tutoring. Students who received tutoring but for fewer than 20 hours were excluded from the study.

            HA: mean change in ELA CST scores (tutored) > mean change in ELA CST scores (not tutored)

       A comparison of California Standardized Test scores in English Language Arts between students attending district-run SES tutoring and students not attending tutoring should demonstrate that district-run SES tutoring is associated with an increase in CST scores from 2006 to 2007. In addition, the study may show that an increase in student achievement correlates with the number of hours students attend a provider's tutoring sessions. The difference in ELA CST scores from 2006 to 2007 was used to measure gains.

                                            Hypothesis #2

       The second hypothesis asked whether hours tutored predict 2007 CST scores above and beyond the variance explained by the 2006 CST scores.


                                              Methods

Subjects and Variables

      For this study, students' 2006 and 2007 ELA CST scores from RUSD Academy School A were examined. The possible scaled score range for the ELA CST was 150 (the lowest) to 600 points. There were 377 students in total at the school, 23 of whom received tutoring. Students who did not have both 2006 and 2007 ELA CST scores were removed from the analysis, leaving 291 students, 21 of whom received 20 or more hours of tutoring. Of the 291 students studied, 57% were female and 43% were male; 23.7% were 3rd graders, 24.7% were 4th graders, 27.5% were 5th graders, and 24.1% were 6th graders (see Table 1). There were 220 students (75.6%) with low socioeconomic status and 71 (24.4%) who were not in the low socioeconomic status category. Students were examined regardless of socioeconomic status. Some students had missing data in the gender and socioeconomic status columns; these values were looked up in a separate file to verify socioeconomic status and gender, and the missing values were then replaced.
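The screening steps above can be illustrated with a minimal pandas sketch. The file names and column names below are hypothetical placeholders; the report does not describe the district's actual data files.

```python
import pandas as pd

# Hypothetical file and column names; the actual data layout is not specified in the report.
df = pd.read_csv("cst_scores.csv")

# Keep only students with both a 2006 and a 2007 ELA CST score (377 -> 291 in this study).
df = df.dropna(subset=["ela_cst_2006", "ela_cst_2007"])

# Fill missing gender / socioeconomic status values from a separate lookup file, as described above.
lookup = pd.read_csv("student_demographics.csv").set_index("student_id")
for col in ["gender", "low_ses"]:
    df[col] = df[col].fillna(df["student_id"].map(lookup[col]))

# Flag the analysis group: students with 20 or more hours of SES tutoring.
df["tutored"] = df["hours_tutored"] >= 20
print(df["tutored"].value_counts())
```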

Table 1.

Grade Levels of Students Studied

   Grade Level      Frequency      Percent      Valid Percent
   3                69             23.7         23.7
   4                72             24.7         24.7
   5                80             27.5         27.5
   6                70             24.1         24.1
   Total            291            100.0        100.0


Results

       The data were examined for outliers. The 2007 ELA CST scores had a standard deviation of 43.02, with no outliers more than 3.5 standard deviations from the mean of 333.83. The 2006 ELA CST scores had a standard deviation of 50.48, with no outliers more than 3.5 standard deviations from the mean of 333.13. The assumption of homogeneity of variance was met, as the standard deviations were within a factor of four of each other (see Table 2). The students were divided into two groups: those receiving 20 or more hours of tutoring and those receiving 0 hours of tutoring. An independent-samples t-test was run along with a hierarchical regression.
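A short sketch of the outlier screen described above, reusing the hypothetical file and column names from the Methods sketch:

```python
import pandas as pd

# Hypothetical file and column names, as in the Methods sketch.
df = pd.read_csv("cst_scores.csv").dropna(subset=["ela_cst_2006", "ela_cst_2007"])

# Count scores more than 3.5 standard deviations from each year's mean.
for col in ["ela_cst_2006", "ela_cst_2007"]:
    z = (df[col] - df[col].mean()) / df[col].std()
    print(col, "values beyond 3.5 SD:", int((z.abs() > 3.5).sum()))
```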

Table 2.

2007 & 2006 CST ELA Test Scores

                                N      Range       Mean       Std. Deviation   Skewness   Kurtosis
   2007 CST ELA Scaled Scores   291    246-468     333.83     43.02            .50        .04
   2006 CST ELA Scaled Score    291    234-490     333.13     50.48            .40        -.28



       After the changes in test scores were examined, two outliers were found: point changes of -130 and 129. Once these outliers were removed, there were 289 valid change scores with a mean of 0.71 and a standard deviation of 34.65 (see Table 3). Figure 1 shows that the change in test scores approximates a normal bell curve, with kurtosis of .06 and skewness of .07.




Table 3.

Change in CST ELA Test Scores without Outliers

                              N      Range          Mean    Std. Deviation   Skewness   Kurtosis
   Change in ELA CST Score    289    -110 to 118    .71     34.65            .07        .06

   Valid N (listwise): 289




Figure 1. Histogram of change in ELA CST scores from 2006 to 2007.
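A minimal sketch of the change-score computation and outlier trimming described above, again using the hypothetical column names from the Methods sketch; the 3.5-SD trimming rule is an assumption, though the report states the removed changes were -130 and 129:

```python
import pandas as pd

# Hypothetical file and column names, as in the Methods sketch.
df = pd.read_csv("cst_scores.csv").dropna(subset=["ela_cst_2006", "ela_cst_2007"])

# Change in ELA CST score from 2006 to 2007.
df["change_ela"] = df["ela_cst_2007"] - df["ela_cst_2006"]

# Trim extreme change scores (assumed cutoff; the report removed point changes of -130 and 129).
z = (df["change_ela"] - df["change_ela"].mean()) / df["change_ela"].std()
trimmed = df[z.abs() <= 3.5]
print(trimmed["change_ela"].describe())
```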


       The Pearson chi-square test was not significant (χ²(1) = .14, p = .71), indicating no significant difference between the expected and observed numbers of males and females enrolled or not enrolled in tutoring. For example, 181.9 females were expected not to be enrolled in tutoring and 181 were observed, showing essentially no difference between the observed and expected counts; 164 males were observed not enrolled in tutoring. Males and females were therefore equally distributed across the enrolled and not-enrolled groups (see Table 4).
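A sketch of this chi-square test using scipy, with the same hypothetical file and column names as the Methods sketch:

```python
import pandas as pd
from scipy.stats import chi2_contingency

# Hypothetical file and column names, as in the Methods sketch.
df = pd.read_csv("cst_scores.csv").dropna(subset=["ela_cst_2006", "ela_cst_2007"])
df["tutored"] = df["hours_tutored"] >= 20

# 2x2 contingency table of gender by tutoring enrollment.
table = pd.crosstab(df["gender"], df["tutored"])

# correction=False gives the plain Pearson chi-square; Table 4 also reports the continuity-corrected value.
chi2, p, dof, expected = chi2_contingency(table, correction=False)
print(f"chi2({dof}) = {chi2:.2f}, p = {p:.2f}")
print("Expected counts:\n", expected)
```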


Table 4.
Information for Pearson Chi-Square Tests

                                Value    df    Asymp. Sig.    Exact Sig.    Exact Sig.
                                               (2-sided)      (2-sided)     (1-sided)

   Pearson Chi-Square           .14a     1     .71

   Continuity Correctionb       .03      1     .87

   Likelihood Ratio             .14      1     .71

   Fisher's Exact Test                                        .83           .44

   N of Valid Cases             368

   a. 0 cells (.0%) have expected count less than 5. The minimum expected count is 10.88.
   b. Computed only for a 2x2 table

       As part of data screening, a univariate analysis of variance was run to obtain Levene's test of equality of error variances. Table 5 shows a significance value higher than .05 for Levene's test; therefore, the assumption of homogeneity of variance was met, indicating that the error variance of the dependent variable was equal across groups.


Table 5.

Independent Samples Test

                                            Levene's F   Sig.   t      df    Sig. (2-tailed)   Mean Difference

   Change in ELA      Equal variances
   CST Scores         assumed               .10          .75    1.89   287   .06               15.09



       An independent-samples t-test was used to compare the mean change scores of students who received tutoring (labeled "Y" for yes) with those who did not (labeled "N" for no). The results indicate that the two groups differed in their change in ELA scores, with students who received tutoring demonstrating greater gains, t(287) = 1.89, p = .03 (one-tailed; see Table 5). Table 6 shows the mean changes in ELA CST scores for students who were tutored and those who were not. The mean change was a 14.75-point increase for tutored students, while students who were not tutored had a mean change of -.34, meaning that on average their scores decreased slightly or stayed the same. The mean difference between the groups was 15.09 points in favor of the tutored group.
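A sketch of Levene's test and the independent-samples t-test reported above, using the hypothetical columns from the Methods sketch; halving the two-tailed p-value reflects the directional hypothesis:

```python
import pandas as pd
from scipy import stats

# Hypothetical file and column names, as in the Methods sketch.
df = pd.read_csv("cst_scores.csv").dropna(subset=["ela_cst_2006", "ela_cst_2007"])
df["change_ela"] = df["ela_cst_2007"] - df["ela_cst_2006"]

# Comparison groups as described in the study: 20+ hours of tutoring versus no tutoring at all.
tutored = df.loc[df["hours_tutored"] >= 20, "change_ela"]
not_tutored = df.loc[df["hours_tutored"] == 0, "change_ela"]

# Levene's test for equality of variances, then the t-test with equal variances assumed.
print(stats.levene(tutored, not_tutored))
result = stats.ttest_ind(tutored, not_tutored, equal_var=True)
print(f"t = {result.statistic:.2f}, one-tailed p = {result.pvalue / 2:.3f}")
```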


Table 6.

Group Statistics, Changes in CST ELA Scores

                                    Tutored    N      Mean     Std. Deviation

      Changes in CST ELA Scores     Yes        20     14.75    36.73
                                    No         269    -.34     34.34


   The mean scores of the 2006 and 2007 ELA CST were also examined. The 2007 mean was only .71 points higher than the 2006 mean, less than a full point of increase. This can be accounted for by the fact that many students' scores went up while many others went down (see Table 7).

Table 7.

Descriptive Statistics of Variables Used

                                        Mean       Std. Deviation      N

      2007 CST ELA Scaled Scores        333.45     42.86               289

      2006 CST ELA Scaled Score         332.74     49.73               289



       Table 8 shows that the only significant association is between the 2006 and 2007 ELA CST scores, which are positively correlated (r = .73, p < .01). In contrast, hours tutored has a small negative association with the 2007 ELA CST scores (r = -.09) and with the 2006 ELA CST scores (r = -.15).
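The correlation matrix in Table 8 can be reproduced with a short pandas call, again using the hypothetical column names from the Methods sketch:

```python
import pandas as pd

# Hypothetical file and column names, as in the Methods sketch.
df = pd.read_csv("cst_scores.csv").dropna(subset=["ela_cst_2006", "ela_cst_2007"])

# Pearson correlations among the 2007 scores, 2006 scores, and hours tutored.
print(df[["ela_cst_2007", "ela_cst_2006", "hours_tutored"]].corr())
```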


Table 8.
Correlations

                                     2007 CST ELA       2006 CST ELA       Hours
                                     Scaled Scores      Scaled Score       Tutored

   Pearson r Correlation
     2007 CST ELA Scaled Scores      1.0                .73                -.09
     2006 CST ELA Scaled Score       .73                1.0                -.15
     Hours Tutored                   -.09               -.15               1.0

   Sig. (1-tailed)
     2007 CST ELA Scaled Scores                         .00                .07
     2006 CST ELA Scaled Score       .00                                   .01
     Hours Tutored                   .07                .01

   N = 289 for all variables.



      A hierarchical regression was run. It showed that 2007 ELA CST scores can be predicted from 2006 ELA CST scores, and that hours tutored does not predict students' performance on the 2007 ELA CST once the 2006 ELA CST scores are accounted for. In terms of effect size, 53.2% of the variance in 2007 scaled ELA CST scores was accounted for by the 2006 scaled scores. The regression results indicate an overall two-predictor model that significantly predicts 2007 ELA CST scores (R² = .53, adjusted R² = .53, R² change for step 2 = .001; overall model F(2, 286) = 162.99, p < .001; see Tables 9 and 10). The model accounted for 53.2% of the variance in 2007 ELA CST scores.
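A sketch of this two-step hierarchical regression using statsmodels, with the same hypothetical column names as the Methods sketch (the report also removed two change-score outliers before this step); compare_f_test gives the F test for the change in R² between steps:

```python
import pandas as pd
import statsmodels.api as sm

# Hypothetical file and column names, as in the Methods sketch.
df = pd.read_csv("cst_scores.csv").dropna(subset=["ela_cst_2006", "ela_cst_2007"])

# Step 1: 2006 ELA CST scores as the only predictor of 2007 scores.
step1 = sm.OLS(df["ela_cst_2007"], sm.add_constant(df[["ela_cst_2006"]])).fit()

# Step 2: add hours tutored as a second predictor.
step2 = sm.OLS(df["ela_cst_2007"], sm.add_constant(df[["ela_cst_2006", "hours_tutored"]])).fit()

# R-squared change and the F test for whether step 2 adds predictive value.
f_change, p_change, _ = step2.compare_f_test(step1)
print(f"Step 1 R^2 = {step1.rsquared:.3f}")
print(f"R^2 change = {step2.rsquared - step1.rsquared:.3f}, F change = {f_change:.2f}, p = {p_change:.2f}")
```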



Table 9.

Hierarchical Regression


   Model    R Square    Adjusted R Square    R Square Change    F Change    df1    df2    Sig. F Change

   1a       .53         .53                  .53                326.29      1      287    .000

   2b       .53         .53                  .001               .38         1      286    .54

   a. Predictors: (Constant), 2006 CST ELA Scaled Score
   b. Predictors: (Constant), 2006 CST ELA Scaled Score, Hours Tutored

           According to the ANOVA in Table 10, the relationships between the dependent variable and the predictors are linear, and the significance values (p < .001) show that the 2006 test scores can be used to predict the 2007 test scores.

Table 10.

ANOVA

      Model             Sum of Squares    df     Mean Square     F        Sig.

      1   Regression    281434.577        1      281434.58       326.2    .00a
          Residual      247542.945        287    862.52
          Total         528977.522        288

      2   Regression    281763.875        2      140881.94       162.99   .00b
          Residual      247213.647        286    864.38
          Total         528977.522        288

   a. Predictors: (Constant), 2006 CST ELA Scaled Score
   b. Predictors: (Constant), 2006 CST ELA Scaled Score, Hours Tutored
   c. Dependent Variable: 2007 CST ELA Scaled Scores



       Table 11 presents the standardized coefficients (Beta values) for both models.
Table 11.
Coefficients

      Model                              Standardized Coefficients (Beta)    t        Sig.

      1    (Constant)                                                        10.62    .00
           2006 CST ELA Scaled Score     .73                                 18.06    .00

      2    (Constant)                                                        10.29    .00
           2006 CST ELA Scaled Score     .73                                 17.92    .00
           Hours Tutored                 .03                                 .62      .54
             a. Dependent Variable: 2007 CST ELA Scaled Scores

Discussion

      Although the sample included almost 300 students, the fact that only 20 of them had 20 or more hours of tutoring suggests that a larger sample of tutored students needs to be examined before strong conclusions may be drawn. It was shown, though, that the 2006 ELA CST scores act as a predictor of 2007 ELA CST scores, while hours tutored does not predict 2007 ELA CST scores. According to the t-test, there was a significant gain in mean scores for students who received 20 or more hours of tutoring. Future studies may consider grouping students based on point increases to determine exactly how many students increased or decreased their scores; means by themselves do not show how many students increased or decreased.


       According to the hierarchical regression, hours tutored does not predict an increase in students' ELA CST scores. When one examines the mean scores, though, students with 20 or more hours of tutoring increased their scores by an average of nearly 15 points, whereas students who were not tutored showed no mean increase. The hierarchical regression may have suffered from the small sample of only 20 students who participated in 20 or more hours of tutoring, which could explain why hours tutored did not predict 2007 test scores. Further research with larger sample sizes is needed.

       It is difficult to show an association between a treatment and an outcome when students may be subject to outside influences. Had the data shown an effect of tutoring on CST scores, the question would have remained whether outside influences were at work. The students at the same school who did not receive tutoring were therefore used as a control group to help account for any variance arising from outside influences. Students from several schools could be examined at one time to increase the sample size, but students from different schools may experience different influences, and the variance between schools would have to be accounted for.

       Overall, this study showed the difficulty of demonstrating an association between ELA CST scores and hours tutored. The expected gain for tutored students appeared in the independent t-test, but hours tutored was not significant once the 2006 ELA CST scores were accounted for in the hierarchical regression. The study also showed that 2007 scores can be predicted from 2006 scores. This information will help in accounting for variance when studying the effect of hours tutored in future studies.


                                          References
Ascher, C. (2006, October). NCLB's Supplemental Educational Services: Is This What Our
       Students Need? Phi Delta Kappan, 88(2), 136-141. Retrieved October 1, 2007, from
       Academic Search Premier database.
Bowler, R. R. (2007, March). States Lack Funds and Staff to Monitor Supplemental Ed Services.
       Electronic Education Report.

Burch, P. (2007, May). Supplemental education services under NCLB: Emerging evidence and
       policy issues. Retrieved February 3, 2008 from the Great Lakes Center for Education
       Research and Practice Web site:
       http://www.greatlakescenter.org/docs/Policy_Briefs/Burch_NCLB.pdf
California County Superintendents Educational Services Association (CCSESA). (2004,
       February). California Curriculum News Report, A Publication of the Curriculum &
       Instruction Steering Committee. 29 (3). Retrieved March 10, 2008 from:
       http://wwwstatic.kern.org/gems/ccsesaCisc/Just.29.3.pdf
Department of Education, United States of America. (2005, June 13). No Child Left Behind,
      Supplemental Educational Services Non-Regulatory Guidance. Retrieved March 5, 2008
      from http://www.ed.gov/policy/elsec/guod/suppsvcsguid.doc
Fusarelli, L. (2007, January). Restricted Choices, Limited Options-Implementing Choice and
       Supplemental Educational Services in No Child Left Behind. Educational Policy, 21(1),
       132-154. Retrieved February 2, 2008, from Academic Search Premier database.
Miners, Z. (2007, May). SES Effectiveness Is a Matter of Debate. District Administration, 43(5),
       18-18. Retrieved February 2, 2008 from Academic Search Premier database.
Office of Research, Evaluation, and Accountability; Office of Extended Learning Opportunities.
       (2007). SES Tutoring Programs: An evaluation of year 3 in the Chicago Public Schools.
       Chicago: Chicago Public Schools, Office of Research, Evaluation, and Accountability;
       Office of Extended Learning Opportunities. Retrieved February 3, 2008 from:
       http://www.cpsafterschool.org/SESreportyear3.pdf
Pascopella, A. (2004, June). Signs of Improvement with SES. District Administration, 40(6), 21-
      21. Retrieved February 3, 2008 from Academic Search Premier database.
Potter, A., Ross, S., Paek, J., & McKay, D. (2007, March). Supplemental educational Services:
        2005-2006 (2004-2005 Student Achievement Results). Memphis, TN: Center for
        Research in Educational Policy.
Robelen, E. (2004, September 8). Ed. Dept. Says States Set Rules on Tutoring. Education Week,
      24(2), 34-34. Retrieved February 2, 2008, from Academic Search Premier database.
Robelen, E. (2007, April 25). House Panel Examines NCLB Supplemental Services. Education
      Week, 26(34), 23-23. Retrieved February 2, 2008, from Academic Search Premier
      database.


Sunderman, G. (2006, October). Do Supplemental Educational Services Increase Opportunities
      For Minority Students?. Phi Delta Kappan, 88(2), 117-122. Retrieved February 2, 2008,
      from Academic Search Premier database.
Viadero, D. (2007, June 13). Evidence Thin on Student Gains From NCLB Tutoring. Education
       Week, 26(41), 7-7. Retrieved March 5, 2008, from Academic Search Premier database.

								