College Algebra Course Redesign
Evolution or Revolution?
The process of change…

Paper presented at the:
International Conference on Technology in Collegiate Mathematics
Chicago, Illinois, USA
12 March 2010

Tammy Muhs

University of Central Florida
Department of Mathematics
4000 Central Florida Blvd.
Orlando, Florida 32816-1364
tmuhs@mail.ucf.edu




Abstract
In this paper, the effects of a College Algebra course redesign on student learning are investigated. College
Algebra was restructured to provide a small within large environment where students work in an interactive
online system and receive individualized instruction from instructors, undergraduate peer tutors, and graduate
teaching assistants. A faculty cohort that is focused, trained, and committed to teaching this General
Education Program (GEP) course provides sustainable, consistent instruction through course coordination. The
challenges of formal assessment in larger classes on smaller budgets are addressed by the introduction of online
testing. Results from a successful pilot are described and areas for future work are identified.


Introduction: Why did we consider a mathematics course redesign?
College Algebra has consistently been the highest enrollment course in the University of Central Florida
Mathematics Department, with fall enrollments of up to 2250 students and annual enrollments in excess of 4000
students. Historically, College Algebra was taught in large lectures having 384 students per class, independent
sections of 49 students per class, and mixed mode sections with 21 students per class. The large lecture and
independent classes met three hours per week with an additional recitation hour for students in the large
lectures. The mixed mode classes met one hour per week, with an online component covering the remaining
instructional hours.
Irrespective of format, there were issues with course drift due to the many different instructors teaching the
course. Additionally, our University was facing a shortage of classroom space and, as was typical across the
country, our state had mandated reductions in our departmental and support budgets. A benefit of the mixed
mode classes was a lower instructional cost per student and reduced classroom space requirements compared
to the traditional format. However, the mixed mode classes had a withdrawal rate more than double that of the
traditional format classes, which contributed to an overall lower pass rate. We wanted our budget to benefit
from the cost savings of additional mixed mode classes, but we were not willing to do so at the expense of our
students. We needed a course redesign to address the success rate, classroom shortage, and drift issues while
servicing a large student population on a decreasing amount of funds.

Design Process: How did we redesign College Algebra?
The process was certainly one of evolution as opposed to revolution. Beginning in 2006, pedagogy changes
included the introduction of in-class quizzes and weekly online assignment deadlines for our mixed mode
College Algebra classes. In fall 2007, a partial redesign of College Algebra was completed. This partial
redesign retained approximately the same number of sections as the fall 2006 semester, with the addition of
course coordination and a change in the instructional mix. Specifically, the issue of course drift was addressed
during the course redesign by using a coordinated effort across multiple sections of the course. This was easily
facilitated with the use of the coordinator feature of MyMathLab to create and monitor assignments across
numerous sections, which were taught by a variety of instructors. The instructional mix was also changed to
result in a lower cost per student when compared to the previous fall semester. Despite these improvements, the
pass rate still hovered around 50%. Although this was an improvement over the 2005 pass rate of 35%, it
was clear that additional work needed to be done.
The University of Central Florida was selected to participate in Round II of the National Center for Academic
Transformation (NCAT) Colleagues Committed to Redesign program. The modified Emporium Model was
selected for the second phase redesign of College Algebra with the pilot scheduled for fall 2008. Traditionally,
the student population taking College Algebra is very diverse in academic goals, background, schedules, and
abilities. The modified Emporium Model provided these students the opportunity to choose when to access
course materials based on their schedules, and which of the available instructional resources to use based on
their academic needs. The structure of the redesign included multiple smaller sections of the course being
combined into larger sections of between 300 and 384 students. In order to encourage active learning, enrolled
students were required to work in a designated lab for a minimum of three hours per week as part of their course
grade. As the Mathematics Department did not have a computer lab that would accommodate this many
students, a general purpose computer lab was used for the weekly required lab hours during the pilot. While in
the lab, students utilized the MyMathLab learning environment which has a variety of interactive materials and
activities including online homework, quizzes, lecture videos, worked examples, guided examples, practice
tests, and individualized study plans. The online coursework provided immediate feedback to students, which
has been shown to lead to increased student learning. Additionally, faculty, Graduate Teaching Assistants
(GTAs), Undergraduate Teaching Assistants (UTAs), and peer tutors were available to provide on-demand
personal assistance to students working in the lab. While the change in the instructional mix, the reduction in
the number of sections from thirty-five to six, and the increase in section size from 21-384 students to
300-384 students resulted in a reduction in cost per student for the College Algebra program, it was
projected that the teaching staff providing assistance in the lab would increase the cost per student by more than
this savings.

Given the need to reduce the cost of the College Algebra program, and the challenges of formal assessment in
large auditorium classrooms, we initiated online testing for College Algebra as well as our other four GEP
mathematics courses during the fall 2008 semester. As we were spending approximately $7000 per year
printing College Algebra tests alone, it was determined that the savings from online testing for these five
courses would not only defray the cost of the teaching staff providing assistance in the lab but would
also result in a net reduction in cost per student from previous semesters. To free the lab space necessary for
online testing, College Algebra students are not required to complete their lab hours during three weeks of the
semester. During these three weeks, the lab is not used for learning; instead, it becomes a testing lab for the five
GEP courses, with the regularly scheduled teaching staff serving as test proctors.

As the lab is smaller than our class enrollments, testing often occurs at a time other than the normal class period.
Students are able to schedule their test appointment at a time that works well with their learning style, as well as
their academic and personal schedule, from the multiple time slots available for their course. The online test
scheduling system developed by the textbook publisher is utilized by students to schedule their testing
appointments. The system automatically sends a confirmation email after the student has scheduled an
appointment and a reminder email about the upcoming appointment the day before the scheduled time. The system
administrator can also send reminder emails to students who have not scheduled their testing
appointment as of a specified date.
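
The scheduling workflow described above reduces to a few simple rules: confirm at booking, remind the day
before the appointment, and nudge students who have not booked by a cutoff date. The sketch below illustrates
only that logic; the publisher's system is proprietary, and the data model, the send_email helper, and the
send_reminders function here are assumptions made for the example.

```python
from datetime import date, timedelta

# Hypothetical illustration of the reminder rules described above; the
# publisher's scheduling system is proprietary, so all names here are assumed.
def send_email(address, subject, body):
    # Stand-in for a real mail call; just prints the message.
    print(f"To: {address}\nSubject: {subject}\n{body}\n")

def send_reminders(students, appointments, today=None):
    """Email students the day before their appointment, and nudge
    students who have not yet booked a testing slot."""
    today = today or date.today()
    for student in students:
        slot = appointments.get(student["id"])
        if slot is None:
            send_email(student["email"], "Schedule your test",
                       "You have not yet scheduled your testing appointment.")
        elif slot - today == timedelta(days=1):
            send_email(student["email"], "Test reminder",
                       f"Reminder: your test appointment is on {slot}.")

# Example usage with made-up data
students = [{"id": 1, "email": "student@example.edu"}]
appointments = {1: date(2010, 3, 13)}
send_reminders(students, appointments, today=date(2010, 3, 12))
```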

Students arrive at their scheduled appointment time and are checked off the list of confirmed students for that
particular time slot. As they enter the lab, they present a blue book to a staff member who exchanges it for a
different blue book that has been pre-numbered with a computer station number. The student goes to that
particular station and logs into the online environment. Once all the students have entered the lab and are
seated, an integrity statement is read and the password for that testing session is given. Students type in the
password which then gives them access to their test which is posted within the MyMathLab environment. For
test security reasons, students are not permitted to leave the lab until the password has been changed. Students
use the blue book for their work. The blue books are either collected and reviewed by the instructor or retained
by the lab and recycled after the testing is completed. The online test questions can be multiple-choice, free
response, or a combination. The MyMathLab settings can be set to allow no, partial, or full credit for equivalent
answers. Although students can receive their test score before leaving the lab because the test is computer graded,
the test settings do not allow students to review their test questions until testing is
completed for that particular course. If desired, a challenge week can be set to give students the opportunity
to request that specific questions be reviewed by their instructor. Accommodations for students entitled to
extended time are easily made by setting the extended time allowed at the student level instead of the class
level. Because each test is algorithmically generated, and question pooling can be used, instances of academic
integrity violations during testing have been virtually eliminated. During our
redesign, we found online testing in mathematics courses to be ideal for large classes: the test can include free
response questions instead of relying on a multiple-choice Scantron test, students can learn their grade as they
leave the testing lab, and students can schedule their test to avoid time conflicts. For a class of any
size, we have found online testing to be a cost-effective, environmentally friendly way to test students.
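
To make the ideas of algorithmic generation and question pooling concrete, the following minimal sketch
assembles a per-student test by drawing one variant from each pool and randomizing its parameters. It is an
illustration of the general technique only, not MyMathLab's internal mechanism, and every name in it is
hypothetical.

```python
import random

# Hypothetical sketch of algorithmic test generation with question pooling;
# not MyMathLab's implementation.
QUESTION_POOLS = [
    # Each pool covers one objective; each entry pairs a template with an
    # answer function of the randomized parameters.
    [("Solve 2x + {b} = {c} for x.", lambda b, c, **_: (c - b) / 2)],
    [("Evaluate f({x}) for f(t) = t^2 - 3t + 1.", lambda x, **_: x**2 - 3*x + 1)],
]

def generate_test(student_id, seed_base=2010):
    """Build one test: a different random variant and parameters per student."""
    rng = random.Random(seed_base + student_id)    # reproducible per student
    test = []
    for pool in QUESTION_POOLS:
        template, answer = rng.choice(pool)        # draw one variant from the pool
        params = {"b": rng.randint(1, 9), "c": rng.randint(10, 30),
                  "x": rng.randint(2, 6)}
        test.append((template.format(**params), answer(**params)))
    return test

# Example: two students receive different but equivalent tests.
for sid in (101, 102):
    print(sid, generate_test(sid))
```

Because the correct answer is computed from the same randomized parameters, each individualized test can be
graded automatically, which is what makes immediate scoring in the lab possible.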

Pilot Results: Was the redesign successful?
Results from the successful fall 2008 pilot caught the attention of our central administration, which offered us the
opportunity to be a part of the newly created President's Class Size Initiative (PCSI) beginning in the summer
2009 semester. One of the goals of the PCSI was to offer students, especially first-time-in-college students, an
improved course experience. This initiative provided additional funding for teaching staff, training, and a new
computer facility known as the Mathematics Assistance and Learning Lab (MALL). Although this funding
reduces the cost savings, it is viewed as an investment in our students. With this initiative, we are able to
provide a state-of-the-art, distraction-free environment that enables students to remain on task and engaged in
learning. Instructors work one-on-one with students, in addition to the assistance students receive from
undergraduate peer tutors and graduate students.


Create a sense of community:
One of the goals of the PCSI was to create a sense of community among the students enrolled in College
Algebra. The large class size was necessary due to the high enrollment numbers, so a small within large
environment was created. Students registered in sections of 19 students, which formed a group or community
for the in-class assignments and a "virtual community" when these same students were in the online
environment. During the one hour per week of scheduled class time, between 16 and 20 of these 19-student
sections meet together in a large lecture hall. The students in each section sit together in the lecture hall
so they can work within their section on class activities, which often involve cooperative learning. As the
students in each section get to know each other, they create a learning community. A learning community
encourages increased communication which has been shown to have an important role in learning and
understanding mathematics (Knuth & Peressini, 2001).

Pass Rate:

Another goal of the initiative was to increase student learning and success rates. One measure of student
learning outcomes is the course pass rate. Our course redesign took place in fall 2008, with modifications for the
PCSI in fall 2009. Both fall 2008 and fall 2009 had significantly higher pass rates than fall semesters in previous
years. The highest pass rate, in fall 2009, had the benefit of the PCSI modifications.

Historical Success Rate - Fall Semesters for Classes with One Lecture Hour
(Percentage of Students Earning a C or Higher)

              Fall 2005    Fall 2006    Fall 2007    Fall 2008    Fall 2009
                35%          51%          50%          74%          78%


During summer 2009, the students enrolled in the redesigned PCSI section had a significantly higher pass rate
of 75.3%, when compared to the students enrolled in the traditional section which had a 61.5% pass rate. The
pass rate for the redesigned PCSI sections during the fall 2009 semester was 78.4% compared to the 72.3% pass
rate of the sections taught in a traditional method.


Student Success Rate (Percentage of Students Earning a C or Higher)

              Summer 2009    Summer 2009    Fall 2009      Fall 2009
              Traditional    Redesigned     Traditional    Redesigned
                61.5%          75.3%          72.3%          78.4%

Final exam scores can also be used as a measure of learning outcomes. The median, mean, and standard
deviation of final exam scores for the summer 2009 and fall 2009 semesters were calculated for the sections
taught in the traditional method as well as for the redesigned sections, with the results shown in the following table.

                      Summer 2009         Summer 2009         Fall 2009           Fall 2009
                      Traditional         Redesigned          Traditional         Redesigned
  Final Exam          Section             Sections            Sections            Sections
  Scores              (n=194 students     (n=136 students     (n=770 students     (n=1043 students
                      who took the        who took the        who took the        who took the
                      final exam)         final exam)         final exam)         final exam)
  Median Score            73.9%               80.0%               82.9%               82.9%
  Mean Score              71.0%               77.5%               78.2%               80.4%
  Std Dev                 15.9%               13.9%               16.3%               14.0%
Grade Distribution:

As shown in the following table, the students enrolled in the redesigned sections had a significantly higher
(p-value < 0.01) proportion of A and B grades when compared to students enrolled in the traditional sections; a
sketch of this proportion comparison appears after the table. It was noted that the withdrawal rate was higher in
the redesigned sections than in the traditional sections. We believe that part of this difference can be attributed
to the attendance and in-class assessments that were required in the initiative sections but not in the traditional
sections.


                         Summer 2009          Summer 2009    Fall 2009                   Fall 2009
Course Grade             Traditional          Redesigned     Traditional                 Redesigned
Distribution             Section              Sections       Sections                    Sections
                         n=226 students       n=162 students n=851 students              n=1174 students
      A or B                 38.1%                 56.8%           55.5%                        62.0%
        C                    23.5%                 18.5%           16.8%                        16.4%
F or NC (no credit)          28.7%                 12.4%           23.2%                        15.4%
        W                     9.7%                 12.3%            4.6%                         6.2%
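
The significance claim above can be checked with a standard comparison of two proportions. The sketch below
runs a chi-square test on the fall 2009 A-or-B counts reconstructed from the percentages in the table; it is an
illustrative re-analysis under that rounding assumption, not the original analysis performed for the paper.

```python
from scipy.stats import chi2_contingency

# Illustrative re-analysis of the fall 2009 A-or-B comparison; counts are
# reconstructed by rounding the reported percentages, so they are approximate.
n_trad, n_redesign = 851, 1174
ab_trad = round(0.555 * n_trad)          # 55.5% A or B, traditional sections
ab_redesign = round(0.620 * n_redesign)  # 62.0% A or B, redesigned sections

table = [[ab_trad, n_trad - ab_trad],
         [ab_redesign, n_redesign - ab_redesign]]
chi2, p_value, dof, _ = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p_value:.4f}")  # p comes out below 0.01, consistent with the claim
```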


Participation Grades:

Students in the redesigned sections earned grades for class participation and lab attendance. Analysis was
completed to determine whether the inclusion of participation and lab grades was resulting in a falsely elevated
pass rate. Course averages were recalculated for all students enrolled in the initiative sections using the same
grade weighting system used in the traditional sections, which did not include the participation grades. For the
summer 2009 semester, the overall pass rate was the same as when the averages were calculated with the
participation grades included; for the fall 2009 semester, the difference was between 0.46% and 1.80%,
depending on the instructor.
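
The check described above amounts to recomputing each student's course average under two weighting schemes
and comparing the resulting pass rates. The sketch below shows that comparison; the weights and the sample
records are placeholders, since the paper does not give the actual grading scheme.

```python
# Illustrative check of whether participation and lab grades inflate the pass
# rate. The weights and sample records are placeholders, not the course's
# actual grading scheme.
WEIGHTS_WITH_PARTICIPATION = {"exams": 0.60, "homework": 0.20,
                              "participation": 0.10, "lab": 0.10}
WEIGHTS_TRADITIONAL = {"exams": 0.75, "homework": 0.25}

def course_average(scores, weights):
    # Weighted average over only the categories in the chosen weighting scheme.
    return sum(scores[cat] * w for cat, w in weights.items())

def pass_rate(students, weights, cutoff=70.0):
    # Fraction of students whose weighted average meets the C cutoff.
    passed = sum(course_average(s, weights) >= cutoff for s in students)
    return passed / len(students)

students = [{"exams": 72, "homework": 85, "participation": 95, "lab": 100},
            {"exams": 65, "homework": 70, "participation": 90, "lab": 80}]
print(pass_rate(students, WEIGHTS_WITH_PARTICIPATION),
      pass_rate(students, WEIGHTS_TRADITIONAL))
```

A comparison of this kind over the full grade records is what produced the small differences (0.46% to 1.80%)
reported above.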
Individualized Instruction:

Another goal of the redesign was to provide more individualized instruction. During the fall 2009 semester,
student responses to an in-class survey showed that 90.79% of students felt the redesigned course
offered at least as much instructional interaction as their other courses, with 60.97% indicating that there was
considerably more interaction than in their other courses.

Conclusion and Future Work
At a time of budget cuts and enrollment growth, a course redesign to improve student success rates and decrease
costs seemed like a very advantageous idea. In higher education, we encourage an individualized faculty by
providing autonomy in course development and delivery, and we standardize the student learning experience
with little accommodation for student learning needs and abilities. Twigg argues that we need to do just the
opposite: individualize student learning and standardize faculty practice (Twigg, 2003). Course redesign
provided the impetus for a new, refreshed way to address diversity of student learning needs. The redesign
allowed students to individualize their learning as they were able to choose when they accessed course materials
for three of their four required hours, and which of the available instructional opportunities they wanted to
utilize. This flexibility accommodated the individual learning styles, abilities, and scheduling needs of our
students. It is felt that the opportunities for learning, progress monitoring, and reinforcing course concepts
provided by the redesigned course were instrumental in addressing the high withdrawal rate mentioned above,
thus increasing the overall success rate of the course. In particular, the weekly lab requirement, progress
monitoring, and access to instructional personnel helped struggling students succeed in the course,
which contributed to an improved pass rate. Active learning has been shown to be an effective
method of improving learning outcomes (Prince, 2004; Twigg, 2003). We feel that requiring students to be
active learners, spending more time doing math instead of watching math, contributed to the course quality and
improved learning outcomes.

The PCSI College Algebra program is in the third semester of offering students an improved course experience.
A mentoring program was started during the fall 2009 semester. The GTA mentors assist faculty with weekly
communication to all enrolled students and one-on-one meetings with identified at-risk students. The mentors
also offer weekly seminar sessions and test reviews prior to each test. Additionally, bimonthly tutor training
sessions have taken place in an effort to improve the quality of assistance provided by the tutors to our students.
The formation of the PCSI College Algebra program, a direct result of the course redesign, is producing
favorable results in terms of course offerings, number of students advised, student-faculty ratios, and student
learning outcomes. Success as measured by retention rates, graduation rates, and performance in later courses
cannot be measured this soon after implementation, but these outcomes will be studied in the future.

Looking forward, we will offer Intermediate Algebra in the same format as College Algebra in the fall 2010
semester and, pending approval and construction of an extension of the MALL, we will begin to offer
Precalculus in this format during the spring 2011 semester.

References
Knuth, E., & Peressini, D. (2001). A theoretical framework for examining discourse in mathematics classrooms.
Focus on Learning Problems in Mathematics, 23(2/3), 5-22.

Prince, M. (2004). Does active learning work? A review of the research. Journal of Engineering Education,
July 2004.

Twigg, C. A. (2003). Improving learning and reducing costs: New models for online learning. EDUCAUSE Review,
September/October 2003.
