Credit Recovery in an Online Summer School Program


By Jesse Bjorngjeld & Robert Conrad
Table of Contents


Literature Review ........................................................................ 2

Contextual Factors ...................................................................... 8

Appendix A: List of Classes ........................................................ 28

Appendix B: Parent Opt-out Letter ............................................ 29

Appendix C: Student Survey (PDF)

Appendix D: Teacher Survey (PDF)

ED 690/Summer 2010: Bjorngjeld/Conrad Final Report       Page ii
Like many school districts throughout California, the Valley Center-Pauma Unified School District
(VCPUSD) finds itself facing budget challenges unlike any it has experienced before. With dwindling
resources, increased class sizes, and declining enrollment, administrators are compelled to think about
new ways of approaching the business of school (from scheduling classes, to offering extracurricular
activities). One such challenge is the creation of an effective summer school program aimed at
remediating failing grades for children who are struggling in core classes such as English and math.

This summer, the district tried a new approach. Abandoning the “usual” model of five-day-a-week
classes on campus, the 2010 high school summer program moved to an online (and mostly) self-paced
model of study. Working independently for the most part, students were proctored by a small group of
teachers who agreed to meet with them on campus for four hours each week to both assess their
progress and provide assistance. Given its “experimental” or trial nature, the district specifically selected
160 high school students to participate in the four-week program.

All of the coursework took place in ALEKS and Education 2020 (E2020), two commercial online learning
and assessment products. Each student began the course by taking an online pretest, and then
continued through the learning modules for four weeks, working mostly at home. Once a week, the
students attended a four-hour session at the high school with a proctoring teacher who offered
academic and technical assistance, checked on student progress, and administered an online final exam
(posttest) at the end of the coursework. Teachers then posted final grades based on course completion
and posttest scores.

Researchers Robert Conrad and Jesse Bjorngjeld attempted to measure the effectiveness of this
program by comparing pretest and posttest scores in each of the courses offered to students (17
courses in 4 content areas altogether), and examining passing rates. The researchers also sought the
personal reactions of students and teachers participating in the summer school program through a
voluntary online survey. Two general questions the researchers hoped to answer were:

        How does enrollment in an independent, online learning program affect the overall
         performance of students enrolled in a remediation summer school program?

        What are the students’ and teachers’ personal perceptions of this approach to learning and its
         effectiveness as an instructional intervention?

Literature Review
Summer school is a long-established part of the American public school system. Originally an extension
of the academic year that promised enrichment and opportunity during the traditional “vacation”
months, its purpose changed dramatically after the 1983 publication of A Nation at Risk: The Imperative
for Educational Reform by the National Commission on Excellence in Education. The commission was
charged with defining the problems afflicting American education. Its challenge was to provide solutions
to these problems in the context of a strong belief at that time that our society was being eroded by a
rising tide of mediocrity. The commission cited lowered scores on the SAT and other standardized tests
as indicators of a pervasive problem.

According to Borman (2001), summer programs were dramatically affected by the “back to basics”
restructuring that A Nation at Risk proposed. Most obvious was their shift in purpose—from enrichment
and acceleration to extended learning and remediation.

Today, the one “factor” that most distinguishes a school’s success is performance on standardized
achievement tests (for example: the California High School Exit Exam, CAHSEE, and the California
Standards Tests, CST) which students take each spring. With dwindling budgets, many districts are
turning to the Internet as a way to manage costs. While online learning is certainly not free, it can free
up dollars that would otherwise go to normal overhead expenses. Offering a higher student-to-teacher
ratio, online learning may reduce the number of teacher salaries paid by the district. In addition,
students working off-campus reduce the need to operate and maintain classrooms and other facilities at
the school site five days a week. But online programs will likely require students to perform more
independently and be more responsible for their own success. Teachers may redefine their role as
instructional leaders and facilitators.

Our literature review looked at summer school programs with an emphasis on three areas:

        What is the effectiveness of summer school at reaching underperforming students?

        Are online learning systems more or less “effective” than traditional approaches?

        What are the students’ perceptions of online learning?

A Brief History of Summer School

Originally, summer school was the product of a traditional school calendar heavily influenced by the
needs of an agrarian society. Because late spring, summer, and fall were critical working months for
family farms, school was not normally in session during these times. Despite urban growth and
industrialization over time, the traditional school calendar remained intact (Borman, 2004; Cooper,

During the “cold war” years of the 1960s—and mostly as a response to the perceived Soviet threat of
nuclear proliferation and dominance in the space race—the federal government infused money into
accelerated math and science programs. Summer school programs became part of the movement,
encouraging students to strengthen their science and mathematics skills through increased study
(Borman, 2004).

Nevertheless, the traditional school calendar remained largely unchanged. Over the years, attempts
were made to introduce year-round education in response to the anachronistic school calendar.
However, no widespread national changes have occurred.

Today, new accountabilities challenge the public school system. Mandatory state testing and grade-
specific standards are confronting districts nationwide. School administrators can no longer afford to
view summer school programs in a casual light. Attempts are being made to create various extended
learning opportunities for students throughout the calendar year, including summer “bridge” programs
and summer school remediation programs (Borman, 2004; Borman, 2001; Kops, 2010; Walker &
Vilella-Velez, 1992).

The Effectiveness of Summer Programs

It appears that summer school is a successful intervention for students who require some form of credit
recovery or remediation. Realistically, however, the validity of this assertion depends on the method of
evaluating success.

When measuring the success of any learning program, it is important to look at what types of assessments
are being used. Although the most common form of assessment in summer school programs is
traditional letter grades, many current summer programs include new considerations. Among these are
end-of-grade tests, end-of-course tests, online instructional programs, and student assessment
interviews. Standardized tests now hold more weight than ever in the eyes of administrators, politicians,
and the general public as they have become a required measure of school success. (Aiken, 2004;
Baenen, 2000)

Students who attend summer school for remediation are attempting to recover credits for classes they
failed during the regular academic year. Below is a review of how four different research teams have
examined summer programs with a focus on remediation.

        Walker & Vilella-Velez (1992) detail a test program—Summer Training and Education Program
         (STEP)—launched in 1984, shortly after the publication of A Nation at Risk. STEP was designed to
         help 14 and 15-year-olds from poor urban families who were lagging behind academically. The
         program linked half-day remedial intervention in math and reading with half-day summer jobs.
         The results of research on nearly 5,000 students participating in Fresno, CA, Boston, MA,
         Portland, OR, San Diego, CA, and Seattle, WA, showed reading and math test scores a half
         grade higher than those of similar control groups receiving only summer jobs. Participants also
         demonstrated improved school attendance and more responsible sexual and social behaviors.
         Approximately 75% of the participants returned voluntarily the following summer for the same
         STEP intervention of reading and math remediation and part-time employment. STEP has been
         replicated in 15 states, serving more than 20,000 students throughout the United States. Much
         time has passed since this article was published, however, so it is unclear whether these
         figures remain current.

        A report by Baenen & Lloyd (2000) focused on students in the Wake County Public School
         System (WCPSS), North Carolina, who failed Algebra 1 in 1994-1995, and retook the course
         during summer school; these children outperformed those who chose instead to repeat the
         course during the school year. Other success factors of interest included student pass/fail rates,
         end-of-course tests, and final letter grades. A regression analysis (accounting for prior
         performance and other factors) suggested that students who failed Algebra 1 and retook
         the course during summer school were more likely to pass the class than those repeating
         the course during the regular academic year.

        A study by Haenn (2001) reported on 1300 students attending a mandatory summer school
         program in Durham, North Carolina; those children went from low achievement levels to grade
         level in less than six weeks. Eighth-grade students who did not perform at grade level, based on
         end-of-grade tests in reading and math, were required to attend mandatory summer school
         before entering high school. An end-of-grade test was administered again when the summer
         session concluded and students were promoted based on test scores and the discretion of a
         Student Assessment Panel. Outcomes indicated that students needing remediation in math
         only, rather than in both reading and math, were more successful in improving their
         grades.

        Aiken (2004) examined the effects of a Fairfax County, Virginia high school summer remediation
         program, focused on gains on standardized tests in six core classes: Algebra I, Biology,
         Chemistry, Geometry, World History, and Geography II. Gains were reported the following year
         for most students who regularly attended classes; however, significantly higher gains were
         achieved by students who attended the summer remediation program. In the
         Commonwealth of Virginia, high school students must receive a passing grade in these core
         courses and a passing score on End-of-Course Standards of Learning tests (EOC SOL) in order to
         receive a high school diploma.

Online Learning for Remediation

A growing body of research suggests that online learning programs “work” for underperforming
students. Because online programs are often more cost-effective than traditional classroom
instruction, school districts are increasingly looking to alternative options for course delivery,
including various online instructional programs. Following are examples of current thinking in this area.

        Humphrey (2006) argues many of the students who are candidates for online credit recovery
         programs simply do not possess the internal motivation and ability to manage their own
         learning. This places an exceptional emphasis on the need for support systems that include a
         comprehensive orientation to successful e-learning as well as instructional facilitation
         throughout the process.

        According to Seppala, Xambo, and Caprotti (2006), a fundamental component of distance
         education is the communication medium. When relying solely on e-communication, the courses
         offered and the learning environment should stimulate interpersonal contact in order to
         motivate participants to remain engaged. Fortunately, a host of communication tools is
         available to the modern-day internet user to facilitate these open lines of communication.

         According to Tucker (2007), school programs built around online instruction can erase the
         perceived artificial boundary between learning that takes place during the school day and that
         which occurs outside of regular school hours. One such program, the Florida Virtual School,
         prides itself on its motto, “anytime, anyplace, any path, any pace,” making online instruction
         available to students 24 hours a day, seven days a week.

        According to MacDonald (2008), the use of online instruction allows for both distance and
         campus-based learning environments to be run simultaneously. As environmental barriers are
         removed, teachers are able to become more engaged in supporting learners with tools not
         previously available such as online media communication tools. The ability to blend online
         learning and face-to-face curriculum provides opportunities for repetition of information when
         the student has access to technology.

        According to Kops (2010), a new weight rests on the shoulders of teachers who facilitate online
         learning. Technology will not automatically do the job of remediation without a well-planned
         and well-maintained curriculum plan. Because students do not have as many opportunities to
         connect with instructors in summer session as they do during the traditional school year,
         best-practice instructors plan longer and more frequent open access hours and stay after class to
         assist students on a regular basis. Communication should be a major priority in the planning and
         implementation of any online remedial course with interactivity a vital concern.

Students’ Perceptions of Online Learning

Because motivation weighs heavily on the learner, student perceptions of the online learning tools at
their disposal factor heavily in the evaluation and reflection stage of course building. These responses
help teachers and administrators decide on important software options as future plans are addressed. A
review of literature indicated these trends:

         According to Seppala, Xambo, and Caprotti (2006), the use of ALEKS online math instruction
         software resulted in mostly favorable feedback from 67 students surveyed in the Netherlands.
         Attending a remedial summer math course at Maastricht University, the students completed a
         written evaluation of the course and their online experiences. High scores (indicating strong
         agreement) were recorded on items including “This summer course offered me a lot,” “The
         summer course was well organized,” and “It was good that I could work on the subject matter
         at my own pace.” Low scores (indicating strong disagreement) were recorded on the item,
         “Learning in an e-learning environment as ALEKS is not different from learning from a
         hard-copy book.” Overall, the average student rating of the quality of support during the
         program was 8.6 on a one-to-ten scale, where 10 represented the highest positive score.

         According to Trotter (2008), administrators of the Florida Virtual School, a state-run online
         program for grades 6-12, found that 17 percent of its 2008 high school students enrolled for
         credit recovery. Students are not resisting online learning, but quantifying this cultural shift
         with data is difficult. Under pressure to graduate, students are turning to online courses that
         help them recover units lost as a result of failing classes. Nationwide, technology-based
         options for credit recovery are expanding.

         According to a study by Hopper & Harrington (2009), responses to online instruction have been
         generally positive. In St. Clair County, Michigan, 1100 students were asked to respond to the use
         of E2020 online instructional software in a credit recovery program and as part of regular
         classroom instructional support. A majority of the students indicated a very positive response to
         the 24/7 availability of the program and the software’s ability to customize instruction for every
         student after pre-testing.

         According to O’Hanlon (2009), many students are beginning to believe that the traditional
         classroom does not meet their needs. In Denver, Colorado, a virtual Online High School hosted
         150 full-time students who chose it over a traditional setting. An alternative to traditional
         summer school, this online program offers students an opportunity for credit recovery during
         the regular school year. A similar program in the Volusia County School District in Florida offers
         over 30 credit recovery programs for full-time and part-time students. Students and teachers
         both believe that the video and internet content offered help participants comprehend subject
         matter they didn’t grasp the first time around. Students see themselves as masters of their own
         pace and their own success. Student confidence is reportedly increased as students feel more
         empowered about their education.

Contextual Factors
Several factors may constrain the results of this study; each is detailed in the subsections that follow.

Differences in Academic Subjects

The online instructional program at Valley Center High School (VCHS) featured 17 different classes or
“tracks” identified by specific subjects and grade levels; examples include English 10A, U.S. History 11B,
and Algebra IA (see Appendix A for a complete list). Although every class included a pretest, posttest,
and final grade, the number of students attending each class/subject varied widely and few students
worked at the same level or on the same material simultaneously.

Test Subject Characteristics

All of the students attending the summer school program were purposefully selected by the district
based on failing grades earned during the regular school year. Therefore, the summer school population
cannot be looked upon as a sampling of the greater school population. Indeed, their demographics,
academic history, and behavior may differ from the general school population. In addition, the number
of students attending who were designated as English Language Learners (ELL) (n=19) was too small to
effectively manipulate for this study. A portion of the ELL group, those attending the English Transition
class, was not required to take the posttest.

Technology Considerations

The district selected two different online instruction packages for use in the summer school program:
ALEKS, specifically for online math instruction, and E2020 for English, health, and history instruction.
Though both are “organized” around pretest/posttest assessments, reporting on the efficacy of these
programs was not a goal of this study. The researchers must assume that these services were carefully
chosen by the district as the most appropriate for this situation.

Both on campus and off, students worked with different brands and models of computers, representing
different operating systems. This study was not intended to examine any of the effects these factors
may have had on the students’ outcomes. The researchers must assume that the students were given
adequate tools and support to succeed in this program.

Voluntary Survey Constraints

Teachers cooperated in offering class time for students to complete the voluntary online survey during
the last on-campus meeting; approximately 43% of the students responded. While this is an excellent
return rate, the researchers recognize that results may not fully represent the larger population of
attendees. The researchers chose to administer the survey on the last day of the program in order to
receive thorough feedback about the entire process. However, survey data was collected before
posttest scores and final grades were examined.

In this study, the researchers intended to explore two key areas of information: 1) the actual test scores
and final grades indicating academic success or failure, and 2) the perceptions of those participating in
the program, both teachers and students. Different approaches and different collection instruments
were required for each.

Test Scores
Students in every class completed pretests and posttests. Our first challenge, then, was to collect the
pretest and posttest scores for each student. Unfortunately, access to these numbers differed between
the two online programs. The ALEKS program is capable of creating and printing spreadsheets with this
information for each specific class. (See sample 1)

Sample 1: An ALEKS spreadsheet with pre-test (goal% before) and post-test (goal% after) scores

For the E2020 program, the researchers had to log on to the program and record scores for each student
(one at a time) from an electronic progress report. This proved very time consuming as each score was
hand-recorded, then keyed into an Excel spreadsheet. (See samples 2 and 3)

Sample 2: In World History 10A class, this student earned a 51.6% on the pretest.

Sample 3: This same student earned a 76% on the posttest.

Final Grades
Neither online program computed or recorded final grades. This was left to the discretion of the teacher
within the context of district policies. Therefore, the researchers relied upon the cooperation of
teachers to report the final grades assigned each student. No specific explanation accompanied the final
grades and none was sought for the study. The researchers compiled all of the student information
germane to the study, including pretest and posttest scores and final grades, into one Excel
spreadsheet. Eventually this spreadsheet was imported into a statistical software package (SPSS)
to facilitate data testing and evaluation.
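As a rough sketch of this compilation step, using pandas in Python with entirely hypothetical student IDs, column names, and scores (none are from the actual study), the merged spreadsheet might be built like this:

```python
import pandas as pd

# Hypothetical records keyed by student ID (illustrative names and values,
# not the study's actual data).
records = [
    {"student": "S001", "subject": "Math",    "pretest": 31.0, "posttest": 65.0, "grade": "C"},
    {"student": "S002", "subject": "English", "pretest": 55.0, "posttest": 72.0, "grade": "B"},
    {"student": "S003", "subject": "Health",  "pretest": 60.0, "posttest": 81.0, "grade": "A"},
]
df = pd.DataFrame(records)

# Pre/post gain for each student, computed column-wise.
df["gain"] = df["posttest"] - df["pretest"]

# Export for import into a statistics package such as SPSS.
df.to_csv("summer_school_scores.csv", index=False)
print(df[["student", "gain"]])
```

Keeping one row per student with all germane fields makes the later filtering by gender and subject area a simple column selection.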

Data Testing
Primarily, the researchers examined mean score differences by performing paired t-test analyses.
Beginning with overall scores for the aggregate population, the researchers then filtered the data to
examine and compare scores based on gender.

Next, data were filtered by subject area. Because the students were spread out among 17 different
classes (creating very small numbers in some cases), the researchers chose to group classes into one of
four basic subject areas: English, health, history, and math. Paired t-tests were used to examine mean
score differences (along with an array of descriptives) in each subject area. Attempts to further break
down subject areas by gender proved to be unreliable as some “cell” sizes were simply too small to
effectively manipulate.

Frequency distributions were calculated for final grades in an effort to determine how many students
“passed” their intended class and thereby recovered lost credits. Included in the examination were
those students who did not finish the program (about 13.6%).
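The core paired t-test can be sketched in Python with SciPy; the scores below are hypothetical placeholders, not the study's data:

```python
from scipy import stats

# Hypothetical pretest/posttest percentages for eight students
# (illustrative values only, not the study's actual data).
pretest  = [31, 44, 52, 28, 60, 47, 39, 55]
posttest = [65, 70, 74, 58, 82, 69, 61, 77]

# Paired t-test: is the mean pretest-posttest difference
# significantly different from zero?
t_stat, p_value = stats.ttest_rel(pretest, posttest)
print(f"t = {t_stat:.3f}, p = {p_value:.4f}")
```

A negative t statistic reflects posttest scores exceeding pretest scores, matching the direction of the difference reported in Table 1.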

Survey Data
In an attempt to examine the individual perceptions of both teachers and students in the program, and
to more fully triangulate the performance data, the researchers created a voluntary online survey using
the professional version of SurveyMonkey, an online (and fee-based) survey generator. Two
complementary surveys were created, one specifically for the students (see Appendix C) and one
specifically for the teachers (see Appendix D). Participation was strictly voluntary for both groups and
the survey was officially made available on the last meeting day.

To reiterate, the researchers believed that a survey offered at the end of the coursework would provide
a better overall picture of the complete online learning experience.

Responses were measured on a standard five-option agreement scale, where 1=Strongly agree and
5=Strongly disagree. To enhance the comparative value of the surveys (in essence, to “see” different
issues from both the student and teacher vantage points), the team created items that were as
identically worded as possible. For instance:

Students: “I had enough time to work on and complete my assignments.”

Teachers: “The students had enough time to work on and complete their assignments.”

The majority of responses were automatically assigned a numerical value by SurveyMonkey for testing.
The researchers examined the mean scores of student and teacher responses independently, and then
aligned their mean scores on similar questions in an effort to compare student perceptions with those of
the teachers.
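The mean-alignment step can be illustrated with a short Python sketch; the responses shown are hypothetical, not survey results from this study:

```python
# Hypothetical 1-5 responses (1 = Strongly agree, 5 = Strongly disagree)
# to the paired item about assignment time; values are illustrative only.
student_responses = [1, 2, 2, 3, 1, 2, 4, 2]   # "I had enough time..."
teacher_responses = [2, 3, 2, 3]               # "The students had enough time..."

student_mean = sum(student_responses) / len(student_responses)
teacher_mean = sum(teacher_responses) / len(teacher_responses)

# Lower means indicate stronger agreement on this scale, so the two
# group means can be compared directly, item by item.
print(f"Students: {student_mean:.2f}  Teachers: {teacher_mean:.2f}")
```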

At the end of each survey, the researchers provided a text box for participants to record comments
about their experiences in the summer school program. The researchers perused these comments
carefully, looking for trends and comparing results by gender. From a qualitative standpoint, the
researchers hoped to compare the survey results between the students and the teachers, looking for
any significant differences or similarities.

Opt-Out Letters

At the beginning of the program, a letter was sent home with every student to their parents (see
Appendix B). This letter introduced the researchers to the families and explained the purpose of the
study. Every parent was given the opportunity to “opt out” of the study by signing and returning the
letter to their child’s teacher.

Population Overview

The test population consisted of ninth through twelfth grade students, all planning to return to the
traditional high school program in the fall. Students were “invited” to the summer school program as a
result of failing a class during the regular school year. Given an opportunity to recover lost credits, 132
students signed up for the program. Once started, 103 students (78%) chose to participate, with 29
failing to attend. The participating group consisted of 58 (56%) boys and 45 (44%) girls. Of this group, 19
(18%) were identified as ELL students. One student’s data was removed from the study in response to
the researchers receiving a parent “opt-out” letter. In the end, 14 students (13%) did not complete the
program.

Test Scores

In the aggregate, a paired t-test comparison of test scores for all students indicated a statistically
significant difference in mean pretest scores (m=43.99) and posttest scores (m=69.59) at the .05 alpha
level. (See table 1)

Table 1: T-test comparison of pretest and posttest scores for all students

Pair 1: Pretest – Posttest (Paired Differences)
  Mean:                                    -25.59193
  Std. Deviation:                           14.83642
  Std. Error Mean:                           1.58157
  95% Confidence Interval (Lower / Upper): -28.73547 / -22.44839
  t:                                       -16.181
  df:                                       87
  Sig. (2-tailed):                          .000

Subject Area Analysis

Test results were divided into four general subject areas: Math (n=40), English (n=26), Health (n=12), and
History (n=10). In the aggregate, a pretest one-way analysis of variance (ANOVA) indicates that the
Health students were the best prepared for their coursework. Their mean score of 59.80 was nearly
twice as high as the mean score that Math students earned (31.13). Also performing well at the pretest
were English students (m=55.08), while History students performed more marginally (m=39.09). (See
table 2)

Variability at the pretest was greatest for the Math students (sd=15.34), indicating a fairly wide range
of scores. Variability was slightly less for students in the Health (sd=14.39) and English (sd=13.35)
classes. History students (more marginal performers, as noted above) were a more homogenous group
with a standard deviation of 9.69.

In the aggregate, the posttest one-way analysis of variance (ANOVA) indicates a marked improvement
for all groups. Final mean scores ranged from 65.20 for Math students (the lowest performers initially)
to 80.50 for those studying Health (the highest performers initially). Variability decreased for all groups
as well, with Health students showing the most dramatic drop, from sd=14.39 at the pretest (the second
highest) to sd=9.39 at the posttest (the lowest). Both History and Math showed very little change in
variability, approximately 2%, from pretest to posttest. The Math group’s variability remained the
highest throughout the program. (See table 2)

Table 2: Aggregate Scores/Pretest and Posttest by Subject Area

Subject Area   Pretest   Pretest Range     Pretest     Posttest   Posttest Range    Posttest
               Mean      (lowest/highest)  Std. Dev.   Mean       (lowest/highest)  Std. Dev.

English        55.08     29.63 / 78.72     13.35       72.54      48 / 100          11.97
Health         59.80     27.42 / 77.42     14.39       80.50      66 / 96            9.39
History        39.10     22.58 / 54.84      9.69       66.40      56 / 88            9.47
Math           31.13      6.00 / 71.00     15.34       65.20      33 / 92           14.99
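A one-way ANOVA of the kind used for these subject-area comparisons can be sketched with SciPy's `f_oneway`; the group scores below are hypothetical stand-ins whose sizes and values do not match the study's data:

```python
from scipy import stats

# Hypothetical pretest scores per subject group (illustrative only).
math_scores    = [20, 35, 28, 44, 31]
english_scores = [50, 58, 61, 49]
health_scores  = [55, 62, 60]
history_scores = [38, 41, 36]

# One-way ANOVA: are the four group means plausibly equal?
f_stat, p_value = stats.f_oneway(math_scores, english_scores,
                                 health_scores, history_scores)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
```

A small p value here would indicate that at least one subject-area mean differs from the others, which is the kind of between-group contrast reported above.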

Gender Analysis

A pretest one-way analysis of variance (ANOVA) indicates that the mean score difference between
boys and girls was not significant (p=.452). As a group, boys reported a higher mean score
(m=45.44) than the girls (m=42.68), while the girls showed greater variability (sd=19.04) than the
boys (sd=17.68).

A posttest one-way analysis of variance (ANOVA) indicates that the mean score difference between
boys and girls was again not significant (p=.082), although the gap widened from pretest to
posttest (2.76 to 5.16). Once again, boys posted a higher mean score (m=71.88) than girls
(m=66.72), and girls continued to show greater variability (sd=14.90) than the boys (sd=12.60). As
a group, the boys outperformed the girls from start to finish. (See table 3)
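A one-way ANOVA such as the ones above compares between-group differences against within-group variability. As a minimal sketch (not the researchers' actual SPSS procedure), the F statistic can be computed in pure Python. The individual scores below are invented placeholders, chosen only so that the two group means match the reported pretest means of 45.44 and 42.68:

```python
# A minimal sketch of the F statistic behind a one-way ANOVA, in pure
# Python. The score lists are hypothetical, NOT the study's raw data.

def one_way_anova_f(*groups):
    """Return the F statistic: between-group MS over within-group MS."""
    all_scores = [x for g in groups for x in g]
    grand_mean = sum(all_scores) / len(all_scores)
    k = len(groups)                  # number of groups
    n = len(all_scores)              # total number of observations
    means = [sum(g) / len(g) for g in groups]
    ss_between = sum(len(g) * (m - grand_mean) ** 2 for g, m in zip(groups, means))
    ss_within = sum((x - m) ** 2 for g, m in zip(groups, means) for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

boys = [31.0, 45.5, 52.0, 60.3, 38.4]    # hypothetical pretest scores
girls = [28.7, 41.2, 55.9, 36.0, 51.6]   # hypothetical pretest scores
print(round(one_way_anova_f(boys, girls), 3))
```

In practice, `scipy.stats.f_oneway` returns both this F statistic and the associated p-value (the p=.452 and p=.082 reported above).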

Table 3: Aggregate Scores/Pretest and Posttest by Gender

Gender         Pretest   Pretest Range      Pretest   Posttest   Posttest Range     Posttest
               Mean      (lowest/highest)   SD        Mean       (lowest/highest)   SD

Male           45.44      7.00 / 78.72      17.69     71.88      42 / 100           12.60

Female         42.68      6.00 / 77.42      19.05     66.72      33 / 96            14.91

Final Grade Comparisons

One measure of success for the summer school program was the final grades students received,
which determined whether they were able to recover lost credits for classes failed earlier in the
year. A notable 79% of the participating students received a passing grade, based on a “passing”
percentage of 60 or higher. For these students, the program succeeded in its core aim of credit
recovery. (See table 4)

Table 4: Final grade distributions for all students

Final Grades               A                    B               C                  D              F / inc

# of Students              14                  18              30                 19                21

% of Total                13.6                17.5            29.6               18.6              20.5

Overall, 95 students participating in the program (93%) received a final grade. The remaining seven
received a “no mark” status, indicating a special arrangement with the school district to independently
complete the course during the fall semester. The reasons for these “no marks” were not shared with
the researchers.
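The 79% figure can be checked directly against the Table 4 counts. As a quick sketch (treating D or better as passing, per the 60-percent threshold):

```python
# Recomputing the overall pass rate from the Table 4 grade counts.
# A grade of D or better (a score of 60% or higher) counts as passing.
grades = {"A": 14, "B": 18, "C": 30, "D": 19, "F/inc": 21}

total = sum(grades.values())              # all reported outcomes
passing = total - grades["F/inc"]         # everyone except F/incomplete
print(f"{passing}/{total} = {100 * passing / total:.0f}% passing")
# → 81/102 = 79% passing
```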

Survey Respondents

Surveys can be helpful tools for exploring the perceptions of participants in a program. The researchers
wanted to answer the basic question, “What are the students’ and teachers’ perceptions of online
learning?”

This was the first year that the Valley Center-Pauma Unified School District ran such a program, making
qualitative feedback especially valuable. The researchers created four general question sections to survey
both the students and the staff (the teacher version used language that addressed the teacher’s
perceptions of the student experience):

    1. How You Approached the Course
    2. Your Perceptions of the Course
    3. Assessment of Personal Performance
    4. Written Response to an Open-Ended Question (See Appendix C and D)

How You Approached the Course:

The researchers used a standard five-point Likert scale: Strongly Agree(1), Agree(2), Neither Agree
nor Disagree(3), Disagree(4), Strongly Disagree(5). In this first section, the most common mean and
mode score was a two (Agree). Students and teachers alike agreed on a number of aspects of the
program, such as:

          Students understood how to succeed in the course.
          Students had enough time to be successful.
          Students had the ability to manage their time.
          Students found the class interesting.
          Students were independent learners.

     Table 5: A comparison of student (ST) and teacher (T) responses on section 1 of the survey
Student Survey Question                                                   ST     T     ST     T    ST     T
                                                                          Mean   Mean  Mode   Mode SD     SD

I understood how to succeed in this course--what I was expected to do,   1.75   2.00   2     2    .866    .000
including due dates/deadlines, etc.

I had enough time to work on and complete my assignments.                1.91   1.80   2     2    .830    .447

I was able to manage my time, and keep on track with my assignments.     2.18   2.20   2     2    .947    .837

I was motivated to do well in this class.                                1.80   2.00   2     2    .734    .707

It was interesting to take a class online.                               2.02   1.80   2     1    .963    .837

I'm able to work well on my own; I'm a strong independent learner.       2.07   2.20   2     2    .884    .447
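Summary statistics of the kind shown in Table 5 can be reproduced with Python's standard library. The coded responses below are an invented example, not the actual survey data:

```python
# Mean, mode, and sample standard deviation for coded Likert responses
# (1 = Strongly Agree ... 5 = Strongly Disagree). Invented example data.
from statistics import mean, mode, stdev

responses = [2, 1, 2, 3, 2, 2, 1, 4, 2, 3]   # hypothetical coded answers

print(f"mean = {mean(responses):.2f}")    # central tendency
print(f"mode = {mode(responses)}")        # most frequent response
print(f"sd   = {stdev(responses):.3f}")   # sample SD, as SPSS reports it
```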

Your Perceptions of the Course:

Student and teacher responses varied more widely in this section, which is perhaps unsurprising in a
high school setting. On software usability and on whether students were urged to ask questions
about the course content, teacher and student responses were closely aligned, falling in the
Agree(2) to Strongly Agree(1) range. On questions about the students’ or teachers’ level of
responsibility, however, the two groups diverged. In other words, each group seemed to shift the
blame when its integrity was called into question.

For instance, the students did not feel nearly as confident as the teachers did about whom to go to with
their technical questions. Similarly, teachers felt that they encouraged the students more than the
students felt encouraged. In addition, the students did not feel as strongly about their confidence in the
course as the teachers thought they did. When asked whether or not the course difficulty level was just
right, neither the teachers nor the students chose to Agree(2) or Strongly Agree(1) with the statement.
In fact, the teachers displayed less confidence regarding the course’s difficulty level.

              Table 6: A comparison of student (ST) and teacher (T) responses on section 2 of the survey

Student Survey Question                                                   ST     T     ST     T    ST     T
                                                                          Mean   Mean  Mode   Mode SD     SD

The software was easy to navigate; I always knew what I was supposed to do.  1.75   1.20   2     1      .686    .447

I knew whom to go to with any technical questions I might have.           1.95   1.20   2     1      .834    .447

My teacher encouraged me to do well.                                      1.77   1.40   2     1      .642    .894

My teacher urged me to ask questions about the course content or the software.  1.98   1.80   2     1      .902   1.304

The course seemed right to me--not too easy but not too hard.             2.21   2.60   2     2      .989   1.342

The course content was easy to understand; I could easily follow it.      2.02   1.60   2     2      .821    .548

Assessment of Personal Performance:

In this section the fourth and fifth questions were negatively worded, a departure from the
convention used elsewhere in the survey. The researchers also switched to a five-point true-false
scale: Completely True(1), Somewhat True(2), Not Sure(3), Somewhat False(4), Completely False(5).
As in the previous section, teachers and students diverged on statements that touched on their own
responsibility, and some blame shifting again appeared: neither group was willing to indicate that
the other’s shortfall was its fault. The most pronounced difference concerned assigned reading: the
teachers were unsure whether students completed it, while the students were confident they had
read all the materials their teachers gave them. The students also reported more boredom with
online learning than the teachers perceived.

Table 7: A comparison of student (ST) and teacher (T) responses on section 3 of the survey

Student Survey Question                                                   ST     T     ST     T    ST     T
                                                                          Mean   Mean  Mode   Mode SD     SD

I tried my very hardest in this class.                                    1.60   2.20   2     2    .660    .837

I asked my teacher questions (whether about the software or the content   2.26   1.80   2     2   1.136    .447
itself) whenever I had them.

I read all the materials my teacher gave me.                              1.72   2.80   1     3    .908   1.304

I was bored by learning online; it just didn't hold my attention.         3.07   4.00   2     4   1.470    .707

I had trouble keeping up with my assignments; it was hard to manage my time.  3.07   3.20   5     3   1.502    .837

Standard Deviation and Frequency Distribution:

Variability was low for most of the questions on the teacher and student surveys. Because of the
small sample size of the teacher group, its variability was not closely examined.

An examination of the frequency distributions for each response revealed two items on the student
survey with unusually divided responses:

    1. I was bored by learning online; it just didn't hold my attention.

                                       Frequency         Percent     Valid Percent     Cumulative Percent

     Valid      Completely true                      7        15.6             16.3              16.3

                Somewhat true                    12           26.7             27.9              44.2

                Not sure                             6        13.3             14.0              58.1

                Somewhat false                       7        15.6             16.3              74.4

                Completely false                 11           24.4             25.6          100.0

                Total                            43           95.6            100.0

     Missing    System                               2         4.4

     Total                                       45         100.0

This frequency distribution paints the most vivid picture of how divided the students were:
responses spread across the full scale, so it would be difficult to claim that the online learning
program as a whole held the students’ attention and kept them from getting bored. The same is true
for the following question.

    2. I had trouble keeping up with my assignments; it was hard to manage my time.

                                       Frequency         Percent     Valid Percent     Cumulative Percent

     Valid      Completely true                      8       17.8              18.6              18.6

                Somewhat true                   10           22.2              23.3              41.9

                Not sure                             8       17.8              18.6              60.5

                Somewhat false                       5       11.1              11.6              72.1

                Completely false                12           26.7              27.9          100.0

                Total                           43           95.6             100.0

     Missing    System                               2         4.4

     Total                                      45          100.0

Conversely, the questions with the lowest standard deviations on the student survey showed strong
agreement among respondents. The following two questions had the lowest standard deviations.

    3. My teacher encouraged me to do well.

                                       Frequency     Percent   Valid Percent    Cumulative Percent

     Valid      Strongly Agree             15         33.3         34.1           34.1

                Agree                      24         53.3         54.5           88.6

                Neither agree nor disagree   5       11.1         11.4          100.0

                Total                      44         97.8        100.0

     Missing    System                     1           2.2

     Total                                 45        100.0

The teachers should be gratified that not a single student disagreed or strongly disagreed with this
statement, indicating broad consensus on this topic. Students also showed strong agreement when
asked about their own effort, as the following question and frequency distribution show.

    4. I tried my very hardest in this class

                                       Frequency     Percent   Valid Percent    Cumulative Percent

     Valid      Completely true            20         44.4         46.5           46.5

                Somewhat true              21         46.7         48.8           95.3

                Not sure                   1           2.2         2.3            97.7

                Somewhat false             1           2.2         2.3           100.0

                Total                      43         95.6        100.0

     Missing    System                     2           4.4

     Total                                 45        100.0
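The Valid Percent and Cumulative Percent columns in these SPSS-style frequency tables can be reconstructed from the raw counts. The sketch below uses the counts from the “I was bored by learning online” item above; the code itself is illustrative, not the researchers' actual procedure:

```python
# Rebuilding Valid Percent and Cumulative Percent, SPSS-style, with
# missing ("System") responses excluded from the valid base.
counts = {
    "Completely true": 7, "Somewhat true": 12, "Not sure": 6,
    "Somewhat false": 7, "Completely false": 11,
}
missing = 2
total = sum(counts.values()) + missing    # 45 surveys returned
valid = sum(counts.values())              # 43 usable responses

cumulative = 0.0
for label, n in counts.items():           # dicts keep insertion order
    valid_pct = 100 * n / valid
    cumulative += valid_pct
    print(f"{label:16s} {n:3d}  {100 * n / total:5.1f}  {valid_pct:5.1f}  {cumulative:6.1f}")
```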

Written Response to an Open-Ended Question:

The open-ended questions elicited a diverse set of responses from the students and teachers.
Student comments ranged from strongly positive to strongly negative, while the teachers wrote
more thoroughly and more positively; the students generally chose brevity.

Table 8: Student responses/numbers and percentages

Positive student   % of positive   Negative student   % of negative   % who made no   N/A
comments           comments        comments           comments        comments        comments

n = 22             46%             n = 8              18%             36%             n = 3

Table 9: Teacher responses/numbers and percentages

Positive teacher comments       % of positive       Negative teacher comments   % of negative   % who made no
                                comments                                        comments        comment

n = 4                           80%                 n = 0                       0%              20% (1 teacher)

Generally Positive Student Comments: (displayed as they were written)

         It was a great learning experience!
         This was an interesting course to take.
         The only problem I had was that the course was a bit too easy for me because I am in the upper
          level (AP, Honors) classes. But for those that aren't, I think it would have been fine.
         the course was extremely easy and i passed with no trouble at all.
         I liked the online classes because it was in my rhythm. I understood most of the problems with
          the explain notes.
         It was nice to be able to take classes during the summer, instead of during the year. The
          program was a little tricky to follow, and sometimes the explanations did explain very well.
         It was easy to manage. Being at home is a distraction.
         it was a different learning expereience that was new and interesting
         i liked the ''explain'' button and i liked how how it gave me the options to do on a selection of
         It was a very helpful learning experience. If I had more time it would make it less rushed

        I thought the course was planned out very well. The course offered me knowledge on the course
         as well as help when I needed it. I liked the online classes and would like to use them more if I
        Hi Mr.Conrad! I really loved the way they planed out summer school for us this year. I really
         liked the fact that I could work on my own and not have to go along with the rest of the class. I
         was able to get a lot done. I believe that it took a lot of self motivation. But thats about it. <3
        it was somewhat boring, but it helped. some of the assesment were long and difficult. it was
         easy to understand what I needed to do
        Very easy to understand, simple processs to follow, needs new more lecturers.
        this was a very good form of educastion.
        It was someone what diffcult to to stay focused and complete assigments. there was too much
         work for just making your class percent progress go up, but other than that it was a little better
         than having a week long class.
        The online class was a little bit boring, but it was nice to be able to do it independantly.
        its awsome
        The course was easy in general, it was easy to understand and fast. If we kept up and did at least
         the designated amount of work each week it was a easily passable class.
        boring but worth it.
        It was an OK class I enjoyed most of it only that I needed help on the worksheets and my
         internet is slow at home so I could not work on much i was able to finnish it though

Generally Negative Student Comments: (displayed as they were written)

        i thought that this course was some what easy. but some of the examples didn't help at all. i got
         stuck a lot and didn't know how to do the problems.
        the course was easy to use but when you got stuck on a problem you couldn't move on.
         Sometimes it was hard to understand the example that they gave you.
        hate aleks,something els than this is better.
        this is easy.
        it was kind of boring. it took a long time to get through some of the assignments. it was kind of
        boring, hard, long

        interesting but not enough time
        it was hard but i manages to complete it on the next school yr that is coming,,

N/A Student Comments (12 students did not comment)

        It was fast.
        this is a test test comment also a test comment
        should have passed it the first time

Teacher Comments:

        I think that as a credit recovery tool, the program worked quite well. I certainly think that if we
         continue to offer this program with some serious refinement, it could be a successful
         educational tool.
        The online Health course worked well. The CAHSEE Language Arts took more motivation and
         follow up phone calls. The online courses seem to be better for personal enrichment.
        This was our first year of online summer school. Overall, I think it went better than expected. I
         ended up with a much higher completion/pass rate in my classes than I expected at the
         beginning. I think most of the students responded well to the online program.
        For those who are truly motivated, it makes it easy to make up for mistakes during the school
         year (i.e. not completing homework, essays, etc.) In other words, they learned the material but
         had gaps and were lazy for most of the year. It would be difficult to take this course in 4 weeks
         for anything other than repeating the class. Overall it went well.

Data Overview
Measured against its initial goal, the Valley Center-Pauma Unified School District succeeded in
helping students recover lost credits through the use of an online learning program. Statistically
significant score gains in all of the content areas support this conclusion. Other interesting
outcomes can be inferred from the data as well.

In terms of growth, the boys generally seemed to do better than the girls. In contrast, the girls
responded more positively to the program overall in their written comments. Teacher perceptions were
generally positive, although noticeably different from those of the students in a few distinct areas.

Although Math and History students posted the largest pretest-to-posttest gains, their final means
remained the lowest; Math students nonetheless displayed a high passing rate. This may be due in
part to reinforcement, as Math skills are practiced throughout the school year. English classes,
perhaps more subjective and less reliant on “right answers”, showed smaller gains and more
modest passing rates.

The Health class, albeit a very small group, showed the strongest growth in this program. Because
the state of California requires Health for all high school students, this type of online instruction
may prove a valuable cost-saving way to meet that requirement. The researchers recognize,
however, that this group is not a large enough sample to support broad generalizations.

Survey Overview
A respectable 43% of the student participants responded to the voluntary survey. Despite some
fluctuations, responses were generally positive. For reasons undetermined, girls generally
responded more positively to the online learning experience than the boys did. Most participants
seemed encouraged by the results, particularly with regard to online learning, and anticipated
continued use of this type of instruction in the future.

Future Survey Suggestions
Teachers acknowledged a need to refine the program’s implementation in the future. This may
speak more to the organization of its delivery than to its effectiveness as a learning tool. Some
future survey considerations that this study did not cover are:

        What challenges prevented those who did not complete the program from doing so?
        Are ELL students encouraged or dissuaded from using online instruction programs?
        Do differences in computer workstations affect student perceptions of their learning?

        Why do girls respond more positively than boys to online learning?
        How might teachers improve their support systems for independent online learners?
        Can an online learning program be effectively implemented during the traditional school year?



Appendix A: List of Classes

Valley Center High School

Summer School Class List 2010
English 10A

English 10B

English 11A

English 11B

English 9A

English 9B


Transition English

U.S. History 11A

U.S. History 11B

World History 10A

World History 10B

Algebra 1A

Algebra 1B-1

Algebra 1B-2

Geometry A

Geometry B

Appendix B: Parent Opt-out Letter
     San Diego State University             Parental Permission Form        Summer School Success

Your child is being asked to participate in a research study about his/her participation in the
Valley Center-Pauma 2010 summer school program. It is important that you read the following
information and ask any questions you might have to be sure you understand how your child will
be participating.

Investigators: This research is being conducted by San Diego State University (SDSU) graduate students
Robert Conrad and Jesse Bjorngjeld. Both are full-time teachers working toward master’s degrees through
the Educational Technology Department at SDSU. Robert Conrad is a full-time teacher at Valley Center
High School. Jesse Bjorngjeld is a full-time elementary school teacher in San Jose, CA.

Purpose of the Study: The purpose of this study is to track the success of the new Valley Center High
School summer school program using computer-based learning. As a service to the district, the
researchers will report on overall student attendance and completion, as well as student reactions to this
new approach to summer school.

Description of the Study: Data will be gathered about attendance, participation, and grades from the
computer program itself. Students will not need to fill out any reports or talk with any researchers.
Students will be asked to voluntarily fill out a five-minute online survey where they will have the
opportunity to comment on the new program. Teachers have agreed to allow in-class time for students to
voluntarily participate in this survey.

Confidentiality: Your child’s confidentiality will be maintained to the extent allowed by law. This research
report will not be published or made available to the public. The researchers’ work will be supervised by
SDSU Educational Technology Department Chair, Dr. Marcie Bober-Michel, and reviewed by Valley
Center-Pauma Unified School District (VCPUSD) Assistant District Supervisor, Mary Gorsuch.

Questions: If you have any questions about this study, please ask. You may contact VCPUSD Assistant
District Supervisor, Mary Gorsuch, at (760) 749-0464. At SDSU, you may contact Dr. Marcie
Bober-Michel.

Voluntary Participation: Participation in this study is voluntary. If you agree to allow your child to
participate you DO NOT need to return this form.

Only sign and return this form if you wish to remove your child from the research study. You may
return the form to your child’s teacher, return it to the Valley Center-Pauma district office, or FAX
it to the Valley Center-Pauma district office at (760) 749-0464 NO LATER THAN FRIDAY, JULY 2.

______________________________________________                      Name of Child (please print)

______________________________________________                      _____________________________
Signature of Parent/Guardian of Participant                         Date

